0. Introduction

The attention mechanism in neural networks is inspired by the human visual attention mechanism. When the human eye focuses on a small region of the visual field, it allocates more attention to that region: it perceives the focal region at "high resolution" while perceiving the surrounding image at "low resolution", and then shifts the focus over time.
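This idea of weighting some inputs more heavily than others can be made concrete with soft attention. The following is a minimal illustrative sketch (in NumPy) of scaled dot-product attention in the style of reference 16; the function and variable names are my own, not from any particular library:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(query, keys, values):
    """Scaled dot-product soft attention.

    query:  (d,)      the current focus of attention
    keys:   (n, d)    one key per input position
    values: (n, d_v)  one value per input position
    Returns the attended context vector and the attention weights.
    """
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)  # similarity of the query to each position
    weights = softmax(scores)           # "high resolution" where weights are large
    context = weights @ values          # blend values by their attention weights
    return context, weights

rng = np.random.default_rng(0)
q = rng.standard_normal(4)       # query vector
K = rng.standard_normal((5, 4))  # 5 input positions, key dim 4
V = rng.standard_normal((5, 3))  # 5 input positions, value dim 3
ctx, w = soft_attention(q, K, V)
```

The weights form a probability distribution over the input positions, so most of the "resolution" goes to the positions with the largest scores while the rest are still perceived, just with small weight.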

References:

  1. [arXiv] - arXiv search results for "attention"
  2. [CV] - Mnih V, Heess N, Graves A. Recurrent models of visual attention[J]. arXiv preprint arXiv:1406.6247, 2014.
  3. [Bahdanau] - Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate[J]. arXiv preprint arXiv:1409.0473, 2014.
  4. [CV] - Ba J, Mnih V, Kavukcuoglu K. Multiple object recognition with visual attention[J]. arXiv preprint arXiv:1412.7755, 2014.
  5. [CV] - Xu K, Ba J, Kiros R, et al. Show, attend and tell: Neural image caption generation with visual attention[J]. arXiv preprint arXiv:1502.03044, 2015.
  6. [Speech] - Chorowski J K, Bahdanau D, Serdyuk D, et al. Attention-based models for speech recognition[J]. arXiv preprint arXiv:1506.07503, 2015.
  7. [Luong] - Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation[J]. arXiv preprint arXiv:1508.04025, 2015.
  8. [Speech] - Bahdanau D, Chorowski J, Serdyuk D, et al. End-to-end attention-based large vocabulary speech recognition[J]. arXiv preprint arXiv:1508.04395, 2015.
  9. [QA] - Yang Z, He X, Gao J, et al. Stacked attention networks for image question answering[J]. arXiv preprint arXiv:1511.02274, 2015.
  10. [Weight normalization] - Salimans T, Kingma D P. Weight normalization: A simple reparameterization to accelerate training of deep neural networks[J]. arXiv preprint arXiv:1602.07868, 2016.
  11. [Text] - Nallapati R, Xiang B, Zhou B. Sequence-to-Sequence RNNs for Text Summarization[J]. 2016.
  12. [Survey] - Wang F, Tax D M J. Survey on the attention based RNN model and its applications in computer vision[J]. arXiv preprint arXiv:1601.06823, 2016.
  13. [Translation] - Wu Y, Schuster M, Chen Z, et al. Google's neural machine translation system: Bridging the gap between human and machine translation[J]. arXiv preprint arXiv:1609.08144, 2016.
  14. [Translation] - Neubig G. Neural Machine Translation and Sequence-to-sequence Models: A Tutorial[J]. arXiv preprint arXiv:1703.01619, 2017.
  15. [BahdanauMonotonic] - Raffel C, Luong T, Liu P J, et al. Online and linear-time attention by enforcing monotonic alignments[J]. arXiv preprint arXiv:1704.00784, 2017.
  16. [Transformer] - Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[J]. arXiv preprint arXiv:1706.03762v4, 2017.
  17. [Blog] - Attention and Augmented Recurrent Neural Networks
  18. [Quora] - How does an attention mechanism work in deep learning?
  19. [Quora] - Can you recommend to me an exhaustive reading list for attention models in deep learning?
  20. [Quora] - What is attention in the context of deep learning?
  21. [Quora] - What is an intuitive explanation for how attention works in deep learning?
  22. [Quora] - What is exactly the attention mechanism introduced to RNN (recurrent neural network)?
  23. [Quora] - How is a saliency map generated when training recurrent neural networks with soft attention?
  24. [Quora] - What is the difference between soft attention and hard attention in neural networks?
  25. [Quora] - What is Attention Mechanism in Neural Networks?
  26. [Quora] - How is the attention component of attentional neural networks trained?
