Local Response Normalization: 60 million parameters and 650,000 neurons
A CNN is a tool: in image recognition, it is a tool for discovering the features of the objects to be recognized in an image, and for discarding information that is useless to the recognition result.
ImageNet Classification with Deep Convolutional Neural Networks
http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks
http://caffe.berkeleyvision.org/tutorial/layers/lrn.html
【Lateral Inhibition】
The local response normalization layer performs a kind of “lateral inhibition” by normalizing over local input regions.
https://prateekvjoshi.com/2016/04/05/what-is-local-response-normalization-in-convolutional-neural-networks/
Why do we need normalization layers in the first place?
A typical CNN consists of the following layers: convolution, pooling, rectified linear unit (ReLU), fully connected, and loss. If the previous sentence didn’t make sense, you may want to go through a quick CNN tutorial before proceeding further. Anyway, the reason we may want normalization layers in our CNN is to introduce some kind of inhibition scheme.
In neurobiology, there is a concept called “lateral inhibition”. What does that mean? It refers to the capacity of an excited neuron to subdue its neighbors. We basically want a significant peak, a form of local maximum, which creates contrast in that area and thereby sharpens sensory perception. Sharpening sensory perception is a good thing! We want the same effect in our CNNs.
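To make the placement concrete, here is a minimal PyTorch sketch of an AlexNet-style first convolutional block (PyTorch is an assumption of convenience; the original model was trained with cuda-convnet). The layer order conv → ReLU → LRN → max-pool and the filter sizes follow the AlexNet paper:

```python
import torch
import torch.nn as nn

# AlexNet-style first block: the normalization layer follows the ReLU
# and precedes the max-pooling, as described in the paper.
block1 = nn.Sequential(
    nn.Conv2d(3, 96, kernel_size=11, stride=4),                  # conv1
    nn.ReLU(),                                                   # unbounded activations
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),  # LRN over n=5 channels
    nn.MaxPool2d(kernel_size=3, stride=2),                       # overlapping pooling
)

x = torch.randn(1, 3, 227, 227)   # one RGB image
print(block1(x).shape)            # torch.Size([1, 96, 27, 27])
```

PyTorch's LocalResponseNorm normalizes across the channel dimension, which matches the across-map scheme used in AlexNet.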
What exactly is Local Response Normalization?
The Local Response Normalization (LRN) layer implements the lateral inhibition we were talking about in the previous section. This layer is useful when we are dealing with ReLU neurons. Why is that? Because ReLU neurons have unbounded activations, and LRN lets us normalize them. We want to detect high-frequency features with a large response: if we normalize around the local neighborhood of an excited neuron, it becomes even more sensitive compared with its neighbors.
At the same time, it will dampen the responses that are uniformly large in any given local neighborhood. If all the values are large, then normalizing those values will diminish all of them. So basically we want to encourage some kind of inhibition and boost the neurons with relatively larger activations. This has been discussed nicely in Section 3.3 of the original paper by Krizhevsky et al.
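Concretely, Section 3.3 of the paper defines the normalized response as

b^i_{x,y} = a^i_{x,y} / ( k + α · Σ_{j=max(0, i−n/2)}^{min(N−1, i+n/2)} (a^j_{x,y})² )^β

where a^i_{x,y} is the activity of kernel map i at position (x, y), the sum runs over n adjacent kernel maps at the same spatial position, N is the total number of kernels in the layer, and the paper sets k = 2, n = 5, α = 10⁻⁴, β = 0.75. A minimal NumPy sketch of this across-channel scheme (an illustration of the formula, not the paper's original cuda-convnet code):

```python
import numpy as np

def lrn(a, k=2.0, n=5, alpha=1e-4, beta=0.75):
    """Across-channel LRN from Section 3.3 of Krizhevsky et al.

    a: activations of shape (num_channels, height, width). Each channel
    is divided by a function of the squared activations of up to n
    neighboring channels at the same spatial position.
    """
    num_channels = a.shape[0]
    b = np.empty_like(a)
    for i in range(num_channels):
        lo = max(0, i - n // 2)
        hi = min(num_channels - 1, i + n // 2)
        denom = (k + alpha * np.sum(a[lo:hi + 1] ** 2, axis=0)) ** beta
        b[i] = a[i] / denom
    return b

a = np.abs(np.random.randn(96, 55, 55).astype(np.float32))  # e.g. post-ReLU conv1 output
print(lrn(a).shape)  # (96, 55, 55); the shape is preserved
```

Note how a channel whose neighbors are all strongly activated gets a larger denominator, which is exactly the inhibition described above.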