RNN and LSTM, Saliency Prediction, Scene Labeling
http://handong1587.github.io/deep_learning/2015/10/09/rnn-and-lstm.html // RNN and LSTM
http://handong1587.github.io/deep_learning/2015/10/09/saliency-prediction.html // Saliency Prediction
http://handong1587.github.io/deep_learning/2015/10/09/scene-labeling.html // Scene Labeling
RNN and LSTM
Published: 09 Oct 2015 Category: deep_learning
Types of RNN
1) Plain Tanh Recurrent Neural Networks
2) Gated Recurrent Neural Networks (GRU)
3) Long Short-Term Memory (LSTM)
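These three cell types differ only in how the hidden state is updated at each time step. A minimal NumPy sketch of one step of each (weight names and shapes are illustrative, and biases are omitted in the gated cells for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rnn_step(x, h, Wx, Wh, b):
    """Plain tanh RNN: h_t = tanh(x W_x + h_{t-1} W_h + b)."""
    return np.tanh(x @ Wx + h @ Wh + b)

def gru_step(x, h, Wxz, Whz, Wxr, Whr, Wxh, Whh):
    """GRU: an update gate z and a reset gate r control how h is rewritten."""
    z = sigmoid(x @ Wxz + h @ Whz)              # update gate
    r = sigmoid(x @ Wxr + h @ Whr)              # reset gate
    h_tilde = np.tanh(x @ Wxh + (r * h) @ Whh)  # candidate state
    return (1.0 - z) * h + z * h_tilde

def lstm_step(x, h, c, Wxi, Whi, Wxf, Whf, Wxo, Who, Wxc, Whc):
    """LSTM: input/forget/output gates plus a separate cell state c."""
    i = sigmoid(x @ Wxi + h @ Whi)              # input gate
    f = sigmoid(x @ Wxf + h @ Whf)              # forget gate
    o = sigmoid(x @ Wxo + h @ Who)              # output gate
    c_new = f * c + i * np.tanh(x @ Wxc + h @ Whc)
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

The gated variants (GRU, LSTM) are generally preferred for long sequences because their additive state updates make it easier for gradients to flow back through many time steps.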
Tutorials
A Beginner’s Guide to Recurrent Networks and LSTMs
http://deeplearning4j.org/lstm.html
A Deep Dive into Recurrent Neural Nets
http://nikhilbuduma.com/2015/01/11/a-deep-dive-into-recurrent-neural-networks/
Long Short-Term Memory: Tutorial on LSTM Recurrent Networks
http://people.idsia.ch/~juergen/lstm/index.htm
LSTM implementation explained
http://apaszke.github.io/lstm-explained.html
Recurrent Neural Networks Tutorial
- Part 1 (Introduction to RNNs): http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
- Part 2 (Implementing a RNN using Python and Theano): http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-2-implementing-a-language-model-rnn-with-python-numpy-and-theano/
- Part 3 (Understanding the Backpropagation Through Time (BPTT) algorithm): http://www.wildml.com/2015/10/recurrent-neural-networks-tutorial-part-3-backpropagation-through-time-and-vanishing-gradients/
- Part 4 (Implementing a GRU/LSTM RNN): http://www.wildml.com/2015/10/recurrent-neural-network-tutorial-part-4-implementing-a-grulstm-rnn-with-python-and-theano/
Understanding LSTM Networks
- blog: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
- ZH: http://www.jianshu.com/p/9dc9f41f0b29
Recurrent Neural Networks in DL4J
http://deeplearning4j.org/usingrnns.html
Train RNN
A Simple Way to Initialize Recurrent Networks of Rectified Linear Units
- arxiv: http://arxiv.org/abs/1504.00941
- gitxiv: http://gitxiv.com/posts/7j5JXvP3kn5Jf8Waj/irnn-experiment-with-pixel-by-pixel-sequential-mnist
- github: https://github.com/fchollet/keras/blob/master/examples/mnist_irnn.py
- github: https://gist.github.com/GabrielPereyra/353499f2e6e407883b32
- blog(“Implementing Recurrent Neural Net using chainer!”): http://t-satoshi.blogspot.jp/2015/06/implementing-recurrent-neural-net-using.html
- reddit: https://www.reddit.com/r/MachineLearning/comments/31rinf/150400941_a_simple_way_to_initialize_recurrent/
- reddit: https://www.reddit.com/r/MachineLearning/comments/32tgvw/has_anyone_been_able_to_reproduce_the_results_in/
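The idea in this paper (the "IRNN") is simply to use ReLU units in a plain RNN and to initialize the recurrent weight matrix to the identity, with small Gaussian input weights. A rough sketch with the Keras 2 API, assuming pixel-by-pixel sequential MNIST as in the linked mnist_irnn.py example (which uses the older Keras 1 syntax); the layer sizes and hyper-parameters here are illustrative:

```python
from tensorflow import keras

# IRNN: a plain RNN with ReLU units, recurrent weights initialized to the
# identity matrix and input weights to small Gaussian noise.
model = keras.Sequential([
    keras.layers.SimpleRNN(
        100,
        activation="relu",
        kernel_initializer=keras.initializers.RandomNormal(stddev=0.001),
        recurrent_initializer=keras.initializers.Identity(gain=1.0),
        input_shape=(784, 1),   # a 28x28 MNIST image fed pixel by pixel
    ),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-6),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```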
Sequence Level Training with Recurrent Neural Networks
- arxiv: http://arxiv.org/abs/1511.06732
- notes: https://www.evernote.com/shard/s189/sh/ada01a82-70a9-48d4-985c-20492ab91e84/8da92be19e704996dc2b929473abed46
Papers
Generating Sequences With Recurrent Neural Networks
- arxiv: http://arxiv.org/abs/1308.0850
- github: https://github.com/hardmaru/write-rnn-tensorflow
- blog: http://blog.otoro.net/2015/12/12/handwriting-generation-demo-in-tensorflow/
DRAW: A Recurrent Neural Network For Image Generation
- arXiv: http://arxiv.org/abs/1502.04623
- github: https://github.com/vivanov879/draw
- github(Theano): https://github.com/jbornschein/draw
- github(Lasagne): https://github.com/skaae/lasagne-draw
Unsupervised Learning of Video Representations using LSTMs (ICML 2015)
- project: http://www.cs.toronto.edu/~nitish/unsupervised_video/
- paper: http://arxiv.org/abs/1502.04681
- code: http://www.cs.toronto.edu/~nitish/unsupervised_video/unsup_video_lstm.tar.gz
- github: https://github.com/emansim/unsupervised-videos
LSTM: A Search Space Odyssey
- paper: http://arxiv.org/abs/1503.04069
- notes: https://www.evernote.com/shard/s189/sh/48da42c5-8106-4f0d-b835-c203466bfac4/50d7a3c9a961aefd937fae3eebc6f540
- blog(“Dissecting the LSTM”): https://medium.com/jim-fleming/implementing-lstm-a-search-space-odyssey-7d50c3bacf93#.crg8pztop
- github: https://github.com/jimfleming/lstm_search
Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets
A Critical Review of Recurrent Neural Networks for Sequence Learning
- arXiv: http://arxiv.org/abs/1506.00019
- intro: “A rigorous & readable review on RNNs”
- blog: http://blog.terminal.com/a-thorough-and-readable-review-on-rnns/
Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks (Winner of the MSCOCO image captioning challenge, 2015)
Visualizing and Understanding Recurrent Networks (Andrej Karpathy, Justin Johnson, Fei-Fei Li)
- paper: http://arxiv.org/abs/1506.02078
- slides: http://www.robots.ox.ac.uk/~seminars/seminars/Extra/2015_07_06_AndrejKarpathy.pdf
Grid Long Short-Term Memory
- arxiv: http://arxiv.org/abs/1507.01526
- github(Torch7): https://github.com/coreylynch/grid-lstm/
Depth-Gated LSTM
Deep Knowledge Tracing
- paper: https://web.stanford.edu/~cpiech/bio/papers/deepKnowledgeTracing.pdf
- github: https://github.com/chrispiech/DeepKnowledgeTracing
Top-down Tree Long Short-Term Memory Networks
Alternative structures for character-level RNNs (INRIA & Facebook AI Research)
- arXiv: http://arxiv.org/abs/1511.06303
- github: https://github.com/facebook/Conditional-character-based-RNN
Pixel Recurrent Neural Networks (Google DeepMind)
- arxiv: http://arxiv.org/abs/1601.06759
- notes(by Hugo Larochelle): https://www.evernote.com/shard/s189/sh/fdf61a28-f4b6-491b-bef1-f3e148185b18/aba21367d1b3730d9334ed91d3250848
Long Short-Term Memory-Networks for Machine Reading
Lipreading with Long Short-Term Memory
Associative Long Short-Term Memory
Representation of linguistic form and function in recurrent neural networks
Architectural Complexity Measures of Recurrent Neural Networks
Easy-First Dependency Parsing with Hierarchical Tree LSTMs
Training Input-Output Recurrent Neural Networks through Spectral Methods
Learn To Execute Programs
Learning to Execute
Neural Programmer-Interpreters (Google DeepMind)
- arXiv: http://arxiv.org/abs/1511.06279
- project page: http://www-personal.umich.edu/~reedscot/iclr_project.html
A Programmer-Interpreter Neural Network Architecture for Prefrontal Cognitive Control
Convolutional RNN: an Enhanced Model for Extracting Features from Sequential Data
Attention Models
Recurrent Models of Visual Attention (Google DeepMind, NIPS 2014)
- paper: http://arxiv.org/abs/1406.6247
- data: https://github.com/deepmind/mnist-cluttered
- code: https://github.com/Element-Research/rnn/blob/master/examples/recurrent-visual-attention.lua
Recurrent Model of Visual Attention (Google DeepMind)
- paper: http://arxiv.org/abs/1406.6247
- GitXiv: http://gitxiv.com/posts/ZEobCXSh23DE8a8mo/recurrent-models-of-visual-attention
- blog: http://torch.ch/blog/2015/09/21/rmva.html
- code: https://github.com/Element-Research/rnn/blob/master/scripts/evaluate-rva.lua
Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
A Neural Attention Model for Abstractive Sentence Summarization (EMNLP 2015, Facebook AI Research)
- arXiv: http://arxiv.org/abs/1509.00685
- github: https://github.com/facebook/NAMAS
Effective Approaches to Attention-based Neural Machine Translation (EMNLP 2015)
Generating Images from Captions with Attention
- arxiv: http://arxiv.org/abs/1511.02793
- github: https://github.com/emansim/text2image
- demo: http://www.cs.toronto.edu/~emansim/cap2im.html
Attention and Memory in Deep Learning and NLP
Survey on the attention based RNN model and its applications in computer vision
Train RNN
Training Recurrent Neural Networks (PhD thesis)
- author: Ilya Sutskever
- thesis: https://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
Deep learning for control using augmented Hessian-free optimization
- blog: https://studywolf.wordpress.com/2016/04/04/deep-learning-for-control-using-augmented-hessian-free-optimization/
- github: https://github.com/studywolf/blog/blob/master/train_AHF/train_hf.py
Hierarchical Conflict Propagation: Sequence Learning in a Recurrent Deep Neural Network
Recurrent Batch Normalization
Optimizing Performance of Recurrent Neural Networks on GPUs
- arxiv: http://arxiv.org/abs/1604.01946
- github: https://github.com/parallel-forall/code-samples/blob/master/posts/rnn/LSTM.cu
Codes
NeuralTalk (Deprecated): a Python+numpy project for learning Multimodal Recurrent Neural Networks that describe images with sentences
NeuralTalk2: Efficient Image Captioning code in Torch, runs on GPU
char-rnn in Blocks
Project: pycaffe-recurrent
Using neural networks for password cracking
- blog: https://0day.work/using-neural-networks-for-password-cracking/
- github: https://github.com/gehaxelt/RNN-Passwords
Recurrent neural networks for decoding CAPTCHAs
- blog: https://deepmlblog.wordpress.com/2016/01/12/recurrent-neural-networks-for-decoding-captchas/
- demo: http://simplecaptcha.sourceforge.net/
- code: http://sourceforge.net/projects/simplecaptcha/
torch-rnn: Efficient, reusable RNNs and LSTMs for torch
Deploying a model trained with GPU in Torch into JavaScript, for everyone to use
- blog: http://testuggine.ninja/blog/torch-conversion
- demo: http://testuggine.ninja/DRUMPF-9000/
- github: https://github.com/Darktex/char-rnn
LSTM implementation on Caffe
Blog
Survey on Attention-based Models Applied in NLP
http://yanran.li/peppypapers/2015/10/07/survey-attention-model-1.html
Survey on Advanced Attention-based Models
http://yanran.li/peppypapers/2015/10/07/survey-attention-model-2.html
Online Representation Learning in Recurrent Neural Language Models
http://www.marekrei.com/blog/online-representation-learning-in-recurrent-neural-language-models/
Fun with Recurrent Neural Nets: One More Dive into CNTK and TensorFlow
Materials to understand LSTM
https://medium.com/@shiyan/materials-to-understand-lstm-34387d6454c1#.4mt3bzoau
Understanding LSTM and its diagrams (★★★★★)
- blog: https://medium.com/@shiyan/understanding-lstm-and-its-diagrams-37e2f46f1714
- slides: https://github.com/shi-yan/FreeWill/blob/master/Docs/Diagrams/lstm_diagram.pptx
Persistent RNNs: 30 times faster RNN layers at small mini-batch sizes (Greg Diamos, Baidu Silicon Valley AI Lab)
http://svail.github.io/persistent_rnns/
All of Recurrent Neural Networks
https://medium.com/@jianqiangma/all-about-recurrent-neural-networks-9e5ae2936f6e#.q4s02elqg
Resources
Awesome Recurrent Neural Networks - A curated list of resources dedicated to RNN
- homepage: http://jiwonkim.org/awesome-rnn/
- github: https://github.com/kjw0612/awesome-rnn
Jürgen Schmidhuber’s page on Recurrent Neural Networks
http://people.idsia.ch/~juergen/rnn.html
Reading and Questions
Are there any Recurrent convolutional neural network implementations out there?
Saliency Prediction
This task involves predicting the salient regions of an image given by human eye fixations.
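Predicted saliency maps are commonly scored against recorded fixation maps; one simple metric is Normalized Scanpath Saliency (NSS), the mean of the z-scored saliency map at the fixated pixels. A minimal NumPy sketch (the array shapes and random data are hypothetical):

```python
import numpy as np

def nss(saliency_map, fixation_map):
    """Normalized Scanpath Saliency: z-score the predicted map, then average
    it over the pixels that were actually fixated (fixation_map is binary)."""
    s = (saliency_map - saliency_map.mean()) / (saliency_map.std() + 1e-8)
    return s[fixation_map.astype(bool)].mean()

# Random stand-ins for a model prediction and recorded fixations.
pred = np.random.rand(480, 640)
fix = np.zeros((480, 640))
fix[np.random.randint(0, 480, 50), np.random.randint(0, 640, 50)] = 1
print("NSS:", nss(pred, fix))
```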
Large-scale optimization of hierarchical features for saliency prediction in natural images
Predicting Eye Fixations using Convolutional Neural Networks
DeepFix: A Fully Convolutional Neural Network for predicting Human Eye Fixations
DeepSaliency: Multi-Task Deep Neural Network Model for Salient Object Detection
SuperCNN: A Superpixelwise Convolutional Neural Network for Salient Object Detection
Shallow and Deep Convolutional Networks for Saliency Prediction
Scene Labeling
Papers
Learning hierarchical features for scene labeling
- intro: “Their approach comprised of densely computing multi-scale CNN features for each pixel and aggregating them over image regions upon which they are classified. However, their method still required the post-processing step of generating over-segmented regions, like superpixels, for obtaining the final segmentation result. Additionally, the CNNs used for multi-scale feature learning were not very deep with only three convolution layers.”
- paper: http://yann.lecun.com/exdb/publis/pdf/farabet-pami-13.pdf
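A rough sketch of the pipeline the quote describes, assuming a PyTorch implementation: shared CNN features are computed on an image pyramid, upsampled to full resolution, concatenated per pixel, classified, and then averaged over superpixel regions. The small network and the superpixel averaging below are placeholders, not Farabet et al.'s actual architecture:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleLabeler(nn.Module):
    """Per-pixel class scores from CNN features computed at several scales."""
    def __init__(self, n_classes, scales=(1.0, 0.5, 0.25)):
        super().__init__()
        self.scales = scales
        self.features = nn.Sequential(   # small shared feature extractor
            nn.Conv2d(3, 16, 7, padding=3), nn.Tanh(),
            nn.Conv2d(16, 64, 7, padding=3), nn.Tanh(),
            nn.Conv2d(64, 256, 7, padding=3), nn.Tanh())
        self.classifier = nn.Conv2d(256 * len(scales), n_classes, 1)

    def forward(self, img):
        h, w = img.shape[-2:]
        feats = []
        for s in self.scales:            # run the same extractor on a pyramid
            x = img if s == 1.0 else F.interpolate(
                img, scale_factor=s, mode="bilinear", align_corners=False)
            f = self.features(x)
            feats.append(F.interpolate(f, size=(h, w), mode="bilinear",
                                       align_corners=False))
        return self.classifier(torch.cat(feats, dim=1))  # (B, n_classes, H, W)

def average_over_superpixels(scores, superpixels):
    """Post-processing step from the quote: average the class scores inside
    each over-segmented region; `superpixels` is an (H, W) integer label map."""
    out = scores.clone()
    for sp in superpixels.unique():
        mask = superpixels == sp
        out[:, :, mask] = scores[:, :, mask].mean(dim=-1, keepdim=True)
    return out
```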
Indoor Semantic Segmentation using depth information
Multi-modal unsupervised feature learning for rgb-d scene labeling
Using neon for Scene Recognition: Mini-Places2
- intro: This is an implementation of the deep residual network used for Mini-Places2, as described in He et al., “Deep Residual Learning for Image Recognition”.
- blog: http://www.nervanasys.com/using-neon-for-scene-recognition-mini-places2/
- github: https://github.com/hunterlang/mpmz
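The building block of the network referenced above is the residual unit from He et al., where the output of a small stack of convolutions is added back to the block's input. A minimal PyTorch sketch of the basic same-width block; the neon implementation in the linked repo differs in framework and details:

```python
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Basic residual unit: y = relu(F(x) + x), with F two 3x3 conv + BN layers."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)   # identity shortcut
```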
Attend, Infer, Repeat: Fast Scene Understanding with Generative Models
Challenges
Large-scale Scene Understanding Challenge
- homepage: http://lsun.cs.princeton.edu/