Deep Learning: Assuming a deep neural network is properly regularized, can adding more layers actually make the performance degrade?

I find this really puzzling. A deeper NN is supposed to be more powerful than, or at least equal to, a shallower one. I have already used dropout to prevent overfitting, so how can the performance degrade?
Yoshua's Answer

Yoshua Bengio, My lab has been one of the three that started the deep learning approach, bac...
 
If you do not change the size of the layers and just add more layers, capacity should increase, so you could be overfitting. However, you should check whether training error increases or decreases. If it increases (which is also very plausible), it means that adding the layers made the optimization harder, with the optimization methods and initialization that you are using. That could also explain your problem. However, if training error decreases and test error increases, you are overfitting.
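As a concrete illustration of this diagnostic, here is a minimal sketch in PyTorch on hypothetical synthetic data (the depths, layer width, dropout rate, and training budget are placeholders, not anything from the original answer). It trains otherwise-identical dropout MLPs at several depths and reports training and test error, so you can see which regime you are in: training error rising with depth points to an optimization problem, while falling training error with rising test error points to overfitting.

```python
# Sketch of the train-vs-test-error diagnostic across depths (assumed setup).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical synthetic regression task standing in for your real data.
X_train, y_train = torch.randn(512, 20), torch.randn(512, 1)
X_test, y_test = torch.randn(256, 20), torch.randn(256, 1)

def make_mlp(n_hidden_layers, width=64, p_drop=0.2):
    """Build an MLP with the given number of hidden layers, each followed by dropout."""
    layers = [nn.Linear(20, width), nn.ReLU(), nn.Dropout(p_drop)]
    for _ in range(n_hidden_layers - 1):
        layers += [nn.Linear(width, width), nn.ReLU(), nn.Dropout(p_drop)]
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

loss_fn = nn.MSELoss()

for depth in (2, 4, 8, 16):
    model = make_mlp(depth)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(300):                      # same training budget at every depth
        opt.zero_grad()
        loss = loss_fn(model(X_train), y_train)
        loss.backward()
        opt.step()
    model.eval()                              # disable dropout for evaluation
    with torch.no_grad():
        train_err = loss_fn(model(X_train), y_train).item()
        test_err = loss_fn(model(X_test), y_test).item()
    print(f"depth={depth:2d}  train={train_err:.3f}  test={test_err:.3f}")
```

Reading the printout follows Bengio's answer directly: if the deeper models show higher training error than the shallow ones under the same budget, the extra layers made optimization harder with this optimizer and initialization; if their training error is lower but test error is higher, they are overfitting despite dropout.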

