
Glorot X, Bengio Y. Understanding the difficulty of training deep feedforward neural networks[C]. International Conference on Artificial Intelligence and Statistics, 2010: 249-256.

@inproceedings{glorot2010understanding,
  title={Understanding the difficulty of training deep feedforward neural networks},
  author={Glorot, Xavier and Bengio, Yoshua},
  booktitle={International Conference on Artificial Intelligence and Statistics},
  pages={249--256},
  year={2010}
}

This paper proposes the Xavier parameter initialization method.

Main Content

At layer \(i=1, \ldots, d\):

\[\mathbf{s}^i=\mathbf{z}^i W^i+\mathbf{b}^i \\
\mathbf{z}^{i+1}= f(\mathbf{s}^i),
\]

where \(\mathbf{z}^i\) is the input to layer \(i\), \(\mathbf{s}^i\) is the pre-activation value, and \(f(\cdot)\) is the activation function (assumed symmetric about 0 with \(f'(0)=1\), e.g. tanh).
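As a minimal sketch of this forward pass (assuming NumPy; the function `forward` and its default \(f=\) tanh are illustrative choices, not code from the paper):

```python
import numpy as np

def forward(x, weights, biases, f=np.tanh):
    """Forward pass: s^i = z^i W^i + b^i, then z^{i+1} = f(s^i)."""
    z = x                          # z^1 is the network input
    for W, b in zip(weights, biases):
        s = z @ W + b              # pre-activation s^i
        z = f(s)                   # input to the next layer, z^{i+1}
    return z
```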

\[\mathrm{Var}(z^{i+1}) = n_i\mathrm{Var}(w^iz^i),
\]

which holds approximately near \(0\) (since \(f'(0)=1\)); here \(z^i, w^i\) are individual elements of \(\mathbf{z}^i, W^i\) respectively. Assume the \(\{w^i\}\) are i.i.d., that \(w^i\) and \(z^i\) are mutually independent, and further that \(\mathbb{E}(w^i)=0\) and \(\mathbb{E}(x)=0\) (where \(x\) is an input sample); then

\[\mathrm{Var}(z^{i+1}) = n_i\mathrm{Var}(w^i)\mathrm{Var}(z^i),
\]

which holds approximately near \(0\). Unrolling this recurrence back to the input \(x\) gives

\[\mathrm{Var}(z^i) = \mathrm{Var}(x) \prod_{i'=0}^{i-1} n_{i'} \mathrm{Var}(w^{i'})
\]

where \(n_i\) denotes the number of input units at layer \(i\).
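This product formula can be checked empirically near 0: with small inputs, tanh is approximately the identity, so the variance changes by a factor of roughly \(n_{i'} \mathrm{Var}(w^{i'})\) per layer. A sketch (assuming NumPy; the width \(n\) and weight scale are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256                                      # units per layer (arbitrary)
sigma_w = 0.02                               # weight standard deviation (arbitrary)
x = rng.normal(0.0, 0.01, size=(10_000, n))  # small inputs, so tanh(s) ~ s

z = x
for _ in range(3):                           # three layers
    W = rng.normal(0.0, sigma_w, size=(n, n))
    z = np.tanh(z @ W)

# Var(z^i) = Var(x) * prod over layers of n * Var(w)
predicted = x.var() * (n * sigma_w**2) ** 3
print(z.var(), predicted)                    # the two agree closely
```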

By backpropagating gradients through the network:

\[\tag{2}
\frac{\partial Cost}{\partial s_k^i} = f'(s_k^i) W_{k, \cdot}^{i+1} \frac{\partial Cost}{\partial \mathbf{s}^{i+1}}
\]
\[\tag{3}
\frac{\partial Cost}{\partial w_{l,k}^i} = z_l^i \frac{\partial Cost}{\partial s_k^i}.
\]

Hence

\[\tag{6}
\mathrm{Var}[\frac{\partial Cost}{\partial s_k^i}] = \mathrm{Var}[\frac{\partial Cost}{\partial s^d}] \prod_{i'=i}^d n_{i'+1} \mathrm{Var} [w^{i'}],
\]
\[\mathrm{Var}[\frac{\partial Cost}{\partial w^i}] = \prod_{i'=0}^{i-1} n_{i'} \mathrm{Var}[w^{i'}] \prod_{i'=i}^d n_{i'+1} \mathrm{Var} [w^{i'}] \times \mathrm{Var}(x) \mathrm{Var}[\frac{\partial Cost}{\partial s^d}],
\]

If we require the variances of the \(z^i\) to be equal across layers in the forward pass, then

\[\tag{10}
\forall i, \quad n_i \mathrm{Var} [w^i]=1.
\]

If we require the variances of the gradients \(\frac{\partial Cost}{\partial s^i}\) to be equal across layers in the backward pass, then

\[\tag{11}
\forall i, \quad n_{i+1} \mathrm{Var} [w^i]=1.
\]

The paper adopts a compromise between the two conditions:

\[\mathrm{Var} [w^i] = \frac{2}{n_{i+1}+n_{i}},
\]

and constructs a uniform distribution from which \(w^i\) is sampled:

\[w^i \sim U[-\frac{\sqrt{6}}{\sqrt{n_{i+1}+n_{i}}},\frac{\sqrt{6}}{\sqrt{n_{i+1}+n_{i}}}].
\]
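A sketch of this sampler (assuming NumPy; the function name and the layer sizes below are illustrative):

```python
import numpy as np

def xavier_uniform(n_in, n_out, rng=None):
    """Sample W^i ~ U[-sqrt(6/(n_in+n_out)), +sqrt(6/(n_in+n_out))].

    A U[-a, a] variable has variance a^2/3, so this choice gives
    Var(w^i) = 2 / (n_in + n_out), the compromise above.
    """
    if rng is None:
        rng = np.random.default_rng()
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

W = xavier_uniform(300, 100)
# W.var() should be close to 2 / (300 + 100) = 0.005
```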

The paper also contains extensive analysis of different activation functions (sigmoid, tanh, softsign, ...); since those are not the focus here, they are not recorded.
