(Repost) Recurrent Neural Network
July 1, 2016 · Deep learning · Word count: 24235
References
Basic
Improvements
- 20170326 Learning Simpler Language Models with the Delta Recurrent Neural Network Framework
- 20170316 Machine Learning on Sequential Data Using a Recurrent Weighted Average
- 20161029 Phased LSTM Accelerating Recurrent Network Training for Long or Event based Sequences
- 20161020 Using Fast Weights to Attend to the Recent Past
- 20161017 Interactive Attention for Neural Machine Translation
- 20160908 LSTM GRU Highway and a Bit of Attention An Empirical Overview for Language Modeling in Speech Recognition
- 20160811 Recurrent Highway Networks
- 20160721 Layer Normalization
- 20160713 Recurrent Memory Array Structures
- 20160524 Sequential Neural Models with Stochastic Layers
- 20160513 LSTM with Working Memory
- 20160412 Recurrent Batch Normalization
- 20160209 Associative Long Short-Term Memory
- 20151214 Memory-based control with recurrent neural networks
- 20151105 Quasi-Recurrent Neural Networks
- 20150503 ReNet A Recurrent Neural Network Based Alternative to Convolutional Networks
- 20150331 End-To-End Memory Networks
- 20150209 Gated Feedback Recurrent Networks
- 20150204 Spatial Transformer Networks
- 20140908 Recurrent Neural Network Regularization
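The improvements above all refine the same basic recurrence. As a point of reference, here is a minimal sketch of a vanilla RNN cell in NumPy — h_t = tanh(W_xh x_t + W_hh h_{t-1} + b) — with toy random weights; the gated, normalized, and memory-augmented variants in the list replace or extend this update.

```python
import numpy as np

# Minimal vanilla RNN cell (illustrative sketch; weights are random toys).
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5

W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b = np.zeros(hidden_size)

def rnn_forward(xs):
    """Run the recurrence over a sequence, returning all hidden states."""
    h = np.zeros(hidden_size)
    states = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b)  # the core recurrence
        states.append(h)
    return np.stack(states)

xs = rng.standard_normal((seq_len, input_size))
hs = rnn_forward(xs)
```

LSTM, GRU, and the variants above keep this shape of computation but add gates or normalization to control how much of h_{t-1} survives each step.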
Image captioning
- 20170330 Speaking the Same Language: Matching Machine to Human Captions by Adversarial Training
- 20170218 MAT A Multimodal Attentive Translator for Image Captioning
- 20161214 Optimization of image description metrics using policy gradient methods
- 20161212 Text guided Attention Model for Image Captioning
- 20161205 Recurrent Image Captioner Describing Images with Spatial-Invariant Transformation and Attention Filtering
- 20161203 Areas of Attention for Image Captioning
- 20160809 Towards cross-lingual distributed representations without parallel text trained with adversarial autoencoders
- 20160706 Sort Story Sorting Jumbled Images and Captions into Stories
- 20160701 Domain Adaptation for Neural Networks by Parameter Augmentation
- 20160620 Variational Autoencoder for Deep Learning of Images Labels and Captions
- 20160615 Image Caption Generation with Text-Conditional Semantic Attention
- 20160607 Encode Review and Decode Reviewer Module for Caption Generation
- 20160605 Multimodal Residual Learning for Visual QA
- 20160531 Attention Correctness in Neural Image Captioning
- 20160512 Movie Description
- 20160503 Improving Image Captioning by Concept-based Sentence Reranking
- 20160501 Delving Deeper into Convolutional Networks for Learning Video Representations
- 20160419 Show Attend and Tell Neural Image Caption Generation with Visual Attention
- 20160406 Improving LSTM-based Video Description with Linguistic Knowledge Mined from Text
- 20160404 Image Captioning with Deep Bidirectional LSTMs
- 20160330 Rich Image Captioning in the Wild
- 20160330 Dense Image Representation with Spatial Pyramid VLAD Coding of CNN for Locally Robust Captioning
- 20160328 Generating Visual Explanations
- 20160328 Attend Infer Repeat: Fast Scene Understanding with Generative Models
- 20160324 A Diagram Is Worth A Dozen Images
- 20160301 Order-Embeddings of Images and Language
- 20160228 Generating Visual Explanations
- 20151117 Deep Compositional Captioning Describing Novel Object Categories without Paired Training Data
- 20151111 Deep Multimodal Semantic Embeddings for Speech and Images
- 20151109 Generating Images From Captions With Attention
- 20151027 Learning Deep Representations of Fine-Grained Visual-Descriptions
- 20151026 Video Paragraph Captioning using Hierarchical Recurrent Neural Networks
- 20151013 Summarization based Video Caption via Deep Neural Networks
- 20150916 Guiding Long-Short Term Memory for Image Caption Generation
- 20150829 Multimodal Convolutional Neural Networks for Matching Image and Sentence
- 20150604 The Long Short Story of Movie Description
- 20150604 Jointly Modeling Embedding and Translation to Bridge Video and Language
- 20150420 Show and Tell A Neural Image Caption Generator
- 20150414 Deep Visual-Semantic Alignments for Generating Image Descriptions
- 20150303 Sequence to Sequence Video to Text
- 20150227 Describing Videos by Exploiting Temporal Structure
- 20150212 Phrase based Image Captioning
- 20150210 Show Attend and Tell Neural Image Caption Generation with Visual Attention
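Most of the captioning papers above follow the "CNN encoder, RNN decoder" recipe (e.g. Show and Tell): an image feature seeds the decoder state, which then emits one token at a time. A hedged toy sketch of greedy decoding, with random stand-in weights and an assumed start-token id of 0:

```python
import numpy as np

# Toy greedy decoder in the Show-and-Tell style. All weights and the
# vocabulary are random placeholders, not a trained model.
rng = np.random.default_rng(2)
vocab_size, hidden = 10, 8
W_hh = rng.standard_normal((hidden, hidden)) * 0.1
W_eh = rng.standard_normal((hidden, hidden)) * 0.1   # embedding -> hidden
W_out = rng.standard_normal((vocab_size, hidden)) * 0.1
embed = rng.standard_normal((vocab_size, hidden)) * 0.1

def greedy_decode(image_feature, max_len=4):
    h = np.tanh(image_feature)   # image feature initializes the state
    token = 0                    # assumed <start> id
    out = []
    for _ in range(max_len):
        h = np.tanh(W_hh @ h + W_eh @ embed[token])
        token = int(np.argmax(W_out @ h))  # greedy: pick the top token
        out.append(token)
    return out

caption = greedy_decode(rng.standard_normal(hidden))
```

The attention-based papers in the list (Show, Attend and Tell and successors) replace the single image feature with a weighted sum over spatial CNN features, recomputed at each decoding step.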
Image generation
- 20170411 A neural representation of sketch drawings
- 20161214 VAE vs GAN
- 20160819 Pixel Recurrent Neural Networks
- 20160726 Semantic Image Inpainting with Perceptual and Contextual Losses
- 20160629 Towards Conceptual Compression
- 20160619 Generating Images Part by Part with Composite Generative Adversarial Networks
- 20160616 Conditional Image Generation with PixelCNN Decoders
- 20160610 Improved Techniques for Training GANs
- 20160610 Deep Directed Generative Models with Energy-Based Probability Estimation
- 20160605 Generative Adversarial Text to Image Synthesis
- 20160529 Generating images with recurrent adversarial networks
- 20160526 Domain-Adversarial Training of Neural Networks
- 20160526 Adversarial Autoencoders
- 20160320 Segmentation from Natural Language Expressions
- 20160229 Generating Images from Captions with Attention
- 20151119 Unsupervised Learning of Visual Structure using Predictive Generative Networks
- 20151109 Generating Images From Captions With Attention
- 20150216 DRAW A Recurrent Neural Network For Image Generation
- 20140610 Generative Adversarial Networks
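Many entries in this section build on the adversarial objective of Goodfellow et al. 2014: a discriminator D scores real versus generated samples, and the generator G is trained to fool it. A toy sketch of the value function V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))], with tiny linear stand-ins for D and G (real systems use neural networks and alternating SGD):

```python
import numpy as np

# Toy GAN value-function computation; D and G are linear placeholders.
rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w_d = rng.standard_normal(2)        # stand-in discriminator weights
w_g = rng.standard_normal((2, 2))   # stand-in generator weights

def D(x):                            # probability that x is "real"
    return sigmoid(x @ w_d)

def G(z):                            # maps noise to a fake sample
    return z @ w_g

real = rng.standard_normal((16, 2)) + 3.0   # toy "real" data
fake = G(rng.standard_normal((16, 2)))      # generated samples

# V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]
value = np.mean(np.log(D(real) + 1e-9)) + np.mean(np.log(1.0 - D(fake) + 1e-9))
```

D is trained to maximize this value and G to minimize it; the recurrent generative models in the list (DRAW, PixelRNN) instead maximize likelihood directly, step by step.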
Visual question answering
- 20160926 The Color of the Cat is Gray 1 Million Full Sentences Visual Question Answering
- 20160620 DualNet Domain-Invariant Network for Visual Question Answering
- 20160612 Training Recurrent Answering Units with Joint Loss Minimization for VQA
- 20160605 Multimodal Residual Learning for Visual QA
- 20160509 Ask Your Neurons A Deep Learning Approach to Visual Question Answering
- 20160504 Leveraging Visual Question Answering for Image-Caption Ranking
- 20160420 Question Answering via Integer Programming over Semi-Structured Knowledge
- 20160406 A Focused Dynamic Attention Model for VQA
- 20160319 Generating Natural Questions About an Image
- 20160309 Image Captioning and Visual Question Answering Based on Attributes and Their Related External Knowledge
- 20160304 Dynamic Memory Networks for Visual and Textual Question Answering
- 20160208 Visualizing and Understanding Neural Models in NLP
- 20160202 Where To Look Focus Regions for Visual Question Answering
- 20151209 MovieQA Understanding Stories in Movies through Question-Answering
- 20151123 Where to look Focus regions for visual question answering
- 20151118 Learning to Answer Questions From Image Using Convolutional Neural Network
- 20151118 Compositional Memory for Visual Question Answering
- 20151118 An attention based convolutional neural network for visual question answering
- 20151117 Ask, Attend and Answer Exploring question-guided spatial attention for visual question answering
- 20151112 LSTM-based Deep Learning Models for Non-factoid Answer Selection
- 20151111 Visual7W Grounded Question Answering in Images
- 20151109 Explicit Knowledge-based Reasoning for Visual Question Answering
- 20151107 Stacked attention networks for image question answering
- 20151107 Simple Baseline for Visual Question Answering
- 20150521 Are you talking to a machine? dataset and methods for multilingual image question answering
- 20150508 Exploring Models and Data for Image Question Answering
- 20150505 Ask Your Neurons A Neural-based Approach to Answering Questions about Images
- 20150503 VQA Visual Question Answering
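A recurring baseline in the VQA papers above (cf. "Simple Baseline for Visual Question Answering") fuses an image feature with a question feature and classifies over a fixed answer vocabulary. A hedged sketch with element-wise fusion and random stand-in features — real systems extract these from a CNN and an LSTM question encoder:

```python
import numpy as np

# Toy VQA baseline: fuse two feature vectors, classify over answers.
rng = np.random.default_rng(4)
feat_dim, num_answers = 8, 5
W = rng.standard_normal((num_answers, feat_dim)) * 0.1  # answer classifier

def answer_distribution(img_feat, q_feat):
    fused = img_feat * q_feat            # element-wise fusion
    logits = W @ fused
    p = np.exp(logits - logits.max())    # numerically stable softmax
    return p / p.sum()

p = answer_distribution(rng.standard_normal(feat_dim),
                        rng.standard_normal(feat_dim))
```

The attention-based VQA papers (Stacked Attention Networks, Where to Look) replace the single fused vector with question-guided weighting over spatial image regions.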
Natural language processing
- 20170330 Speaking the Same Language: Matching Machine to Human Captions by Adversarial Training
- 20170324 Sequence-to-Sequence Models Can Directly Transcribe Foreign Speech
- 20170302 Controllable Text generation
- 20170208 A Hybrid Convolutional Variational Autoencoder for Text Generation
- 20161220 Hierarchical Softmax
- 20160818 Full Resolution Image Compression with Recurrent Neural Networks
- 20160803 Learning Online Alignments with Continuous Rewards Policy Gradient
- 20160803 Dependency-based Convolutional Neural Networks
- 20160726 An Actor-Critic Algorithm for Sequence Prediction
- 20160722 Syntax-based Attention Model for Natural Language Inference
- 20160718 Neural Machine Translation with Recurrent Attention Modeling
- 20160715 Neural Tree Indexers for Text Understanding
- 20160715 Neural Machine Translation with Recurrent Attention Modeling
- 20160715 Attention-over-Attention Neural Networks for Reading Comprehension
- 20160710 CHARAGRAM Embedding Words and Sentences via Character n-grams
- 20160705 Chains of Reasoning over Entities Relations and Text using Recurrent Neural Networks
- 20160703 Fast-forward connections improve the attention mechanism: deep learning boosts machine translation quality
- 20160621 Topic Augmented Neural Response Generation with a Joint Attention Mechanism
- 20160615 The Enemy in Your Own Camp: How Well Can We Detect Statistically-Generated Fake Reviews? An Adversarial Study
- 20160613 Attention-based Multimodal Neural Machine Translation
- 20160607 Memory-enhanced Decoder for Neural Machine Translation
- 20160606 Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification
- 20160606 A Decomposable Attention Model for Natural Language Inference
- 20160605 Deep Reinforcement Learning for Dialogue Generation
- 20160524 Hierarchical Memory Networks
- 20160524 Combining Recurrent and Convolutional Neural Networks for Relation Classification
- 20160509 Parse tree
- 20160428 Crafting Adversarial Input Sequences for Recurrent Neural Networks
- 20160407 Sentence Level Recurrent Topic Model Letting Topics Speak for Themselves
- 20160406 A Recurrent Latent Variable Model for Sequential Data
- 20160404 Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models
- 20160401 Building Machines That Learn and Think Like People
- 20160323 Latent Predictor Networks for Code Generation
- 20160322 Fully Convolutional Attention Localization Networks Efficient Attention Localization for Fine-Grained Recognition
- 20160221 Learning Semantic Representations using Relations
- 20160219 Contextual LSTM (CLSTM) models for Large scale NLP tasks
- 20160208 Efficient Algorithms for Adversarial Contextual Learning
- 20160207 Exploring the Limits of Language Modeling
- 20160206 WebNav A New Large-Scale Task for Natural Language based Sequential Decision Making
- 20160206 Recurrent Memory Network for Language Modeling
- 20160206 Multi-Way Multilingual Neural Machine Translation with a Shared Attention Mechanism
- 20160201 Efficient Character-level Document Classification by Combining Convolution and Recurrent Layers
- 20160110 Strategies for Training Large Vocabulary Neural Language Models
- 20151227 Learning Document Embeddings by Predicting N-grams for Sentiment Classification of Long Movie Reviews
- 20151215 Increasing the Action Gap New Operators for Reinforcement Learning
- 20151203 Target-Dependent Sentiment Classification with Long Short Term Memory
- 20151203 Neural Enquirer Learning to Query Tables in Natural Language
- 20151201 Multilingual Language Processing From Bytes
- 20151119 Multi-task Sequence to Sequence Learning
- 20151119 Alternative structures for character-level RNNs
- 20151111 Larger-Context Language Modeling
- 20151104 Semi-supervised Sequence Learning
- 20151101 A Unified Tagging Solution Bidirectional LSTM Recurrent Neural Network with Word Embedding
- 20151031 Top down Tree LSTM Networks
- 20151029 Attention with Intention for a Neural Network Conversation Model
- 20151026 Thinking on your Feet Reinforcement Learning for Incremental Language Tasks
- 20151013 A Sensitivity Analysis of Convolutional Neural Networks for Sentence Classification
- 20151011 A Diversity-Promoting Objective Function for Neural Conversation Models
- 20150902 A Neural Attention Model for Abstractive Sentence Summarization
- 20150822 Towards Neural Network-based Reasoning
- 20150726 Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
- 20150629 Document Embedding with Paragraph Vectors
- 20150626 On Using Very Large Target Vocabulary for Neural Machine Translation
- 20150622 Skip-Thought Vectors
- 20150619 Deep Knowledge Tracing
- 20150619 A Neural Conversational Model
- 20150617 Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models
- 20150610 Teaching Machines to Read and Comprehend
- 20150609 Scheduled sampling for sequence prediction with recurrent neural networks
- 20150531 A Neural Network Approach to Context Sensitive Generation of Conversational Responses
- 20150427 Neural Responding Machine for Short-Text Conversation
- 20150205 Character-level Convolutional Networks for Text Classification
- 20141223 Grammar as a Foreign Language
- 20141017 Learning to Execute
- 20140910 Sequence to Sequence Learning with Neural Networks
- 20140903 On the Properties of Neural Machine Translation Encoder-Decoder Approaches
- 20140901 Neural Machine Translation by Jointly Learning to Align and Translate
- 20140624 Recurrent Models of Visual Attention
- 20140603 Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
- 20140516 Distributed Representations of Sentences and Documents
- 20050909 METEOR An Automatic Metric for MT Evaluation with Improved Correlation with Human Judgments
- 20010917 Bleu a method for automatic evaluation of machine translation
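The attention mechanism introduced by "Neural Machine Translation by Jointly Learning to Align and Translate" threads through much of this section. A minimal sketch, simplified to dot-product scoring instead of the paper's additive score: the decoder state queries the encoder states, and a softmax over source positions yields a context vector.

```python
import numpy as np

# Dot-product attention over encoder states (simplified illustration).
def attention(query, encoder_states):
    scores = encoder_states @ query          # one score per source position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states       # weighted sum of states
    return context, weights

rng = np.random.default_rng(1)
enc = rng.standard_normal((6, 8))   # 6 source positions, hidden size 8
q = rng.standard_normal(8)          # current decoder state
ctx, w = attention(q, enc)
```

The attention weights w are the soft alignment between the current target word and the source positions; the variants above (attention-over-attention, interactive attention, memory networks) elaborate on how these weights are computed and reused.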