Attention[Content]
0. Introduction
The attention mechanism in neural networks is inspired by the human visual attention mechanism: when the eye fixates on a small region of the visual field, it devotes more attention to that region, perceiving it at "high resolution" while perceiving the surrounding image at "low resolution," and then shifts the focal point over time.
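The "focus strongly on some parts, weakly on the rest" idea above is usually formalized as a softmax-weighted sum over values. As a minimal sketch (not any particular paper's full model), here is the scaled dot-product attention from the Vaswani et al. paper in the reference list below, in plain NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — soft attention over key/value pairs."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row sums to 1: a soft "focus"
    return weights @ V, weights

# Toy example: one query attending over three key/value pairs.
Q = np.array([[1.0, 0.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[10.0], [20.0], [30.0]])
out, w = scaled_dot_product_attention(Q, K, V)
# By symmetry the first and third keys get equal weight, so out == [[20.0]].
```

The softmax makes this "soft" attention: every value contributes, weighted by relevance, which keeps the whole operation differentiable (unlike the "hard" attention of the Mnih et al. and Xu et al. papers below, which samples a single location).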
References:
- [arXiv] - attention search
- [CV] - Mnih V, Heess N, Graves A. Recurrent models of visual attention[J]. arXiv preprint arXiv:1406.6247, 2014.
- [Bahdanau] - Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate[J]. arXiv preprint arXiv:1409.0473, 2014.
- [CV] - Ba J, Mnih V, Kavukcuoglu K. Multiple object recognition with visual attention[J]. arXiv preprint arXiv:1412.7755, 2014.
- [CV] - Xu K, Ba J, Kiros R, et al. Show, attend and tell: Neural image caption generation with visual attention[J]. arXiv preprint arXiv:1502.03044, 2015.
- [Speech] - Chorowski J K, Bahdanau D, Serdyuk D, et al. Attention-based models for speech recognition[J]. arXiv preprint arXiv:1506.07503, 2015.
- [Luong] - Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation[J]. arXiv preprint arXiv:1508.04025, 2015.
- [Speech] - Bahdanau D, Chorowski J, Serdyuk D, et al. End-to-end attention-based large vocabulary speech recognition[J]. arXiv preprint arXiv:1508.04395, 2015.
- [QA] - Yang Z, He X, Gao J, et al. Stacked attention networks for image question answering[J]. arXiv preprint arXiv:1511.02274, 2015.
- [Weight normalization] - Salimans T, Kingma D P. Weight normalization: A simple reparameterization to accelerate training of deep neural networks[J]. arXiv preprint arXiv:1602.07868, 2016.
- [Text] - Nallapati R, Xiang B, Zhou B. Sequence-to-Sequence RNNs for Text Summarization[J]. 2016.
- [Survey] - Wang F, Tax D M J. Survey on the attention based RNN model and its applications in computer vision[J]. arXiv preprint arXiv:1601.06823, 2016.
- [Translation] - Wu Y, Schuster M, Chen Z, et al. Google's neural machine translation system: Bridging the gap between human and machine translation[J]. arXiv preprint arXiv:1609.08144, 2016.
- [Translation] - Neubig G. Neural Machine Translation and Sequence-to-sequence Models: A Tutorial[J]. arXiv preprint arXiv:1703.01619, 2017.
- [BahdanauMonotonic] - Raffel C, Luong T, Liu P J, et al. Online and linear-time attention by enforcing monotonic alignments[J]. arXiv preprint arXiv:1704.00784, 2017.
- [Transformer] - Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[J]. arXiv preprint arXiv:1706.03762v4, 2017.
- [Blog] - Attention and Augmented Recurrent Neural Networks
- [Quora] - How-does-an-attention-mechanism-work-in-deep-learning
- [Quora] - Can-you-recommend-to-me-an-exhaustive-reading-list-for-attention-models-in-deep-learning
- [Quora] - What-is-attention-in-the-context-of-deep-learning
- [Quora] - What-is-an-intuitive-explanation-for-how-attention-works-in-deep-learning
- [Quora] - What-is-exactly-the-attention-mechanism-introduced-to-RNN-recurrent-neural-network-It-would-be-nice-if-you-could-make-it-easy-to-understand
- [Quora] - How-is-a-saliency-map-generated-when-training-recurrent-neural-networks-with-soft-attention
- [Quora] - What-is-the-difference-between-soft-attention-and-hard-attention-in-neural-networks
- [Quora] - What-is-Attention-Mechanism-in-Neural-Networks
- [Quora] - How-is-the-attention-component-of-attentional-neural-networks-trained