Coursera Deep Learning 2 Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization - week2, Optimization algorithms
Gradient descent
Batch gradient descent, mini-batch gradient descent, stochastic gradient descent — the three variants differ only in how many training examples are used to compute each gradient step (the whole training set, a small batch, or a single example).
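As a quick sketch (not from the original notes), this is one way to split a training set of column-stacked examples into shuffled mini-batches with numpy; the function name and the default batch size of 64 are just illustrative:

```python
import numpy as np

def random_mini_batches(X, Y, mini_batch_size=64, seed=0):
    """Shuffle the m training columns of X (n_x, m) and Y (1, m),
    then slice them into consecutive mini-batches."""
    np.random.seed(seed)
    m = X.shape[1]
    permutation = np.random.permutation(m)
    X_shuffled, Y_shuffled = X[:, permutation], Y[:, permutation]

    mini_batches = []
    for k in range(0, m, mini_batch_size):
        mini_batches.append((X_shuffled[:, k:k + mini_batch_size],
                             Y_shuffled[:, k:k + mini_batch_size]))
    return mini_batches
```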
There are many algorithms that optimize better than plain gradient descent. Before getting into them, you first need to understand the concept of exponentially weighted averages.
An exponentially weighted average is a way of computing a running average that is very cheap in storage and memory, but not very accurate. This motivates the idea of bias correction, which makes the exponentially weighted average more accurate, especially in the first few steps.
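A minimal sketch of the two ideas on a plain 1-D sequence of observations (the function name is illustrative):

```python
def ewa_with_bias_correction(data, beta=0.9):
    """Running exponentially weighted average of a sequence:
    v_t = beta * v_{t-1} + (1 - beta) * theta_t, then divided by
    (1 - beta**t) so early values are not biased toward zero."""
    v = 0.0
    out = []
    for t, theta in enumerate(data, start=1):
        v = beta * v + (1 - beta) * theta
        out.append(v / (1 - beta ** t))   # bias-corrected estimate
    return out
```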
Momentum (also called gradient descent with momentum)
Plain gradient descent has a well-known problem: each iteration oscillates back and forth instead of heading straight toward the optimum, which is especially pronounced when feature scaling has not been done. This motivates a modified algorithm, gradient descent with momentum.
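A minimal sketch of one momentum update, assuming params and grads are dictionaries keyed by the same parameter names (names and default hyperparameters are illustrative, not the course's exact notebook code):

```python
def momentum_step(params, grads, v, beta=0.9, learning_rate=0.01):
    """One gradient-descent-with-momentum update: v accumulates an
    exponentially weighted average of past gradients, which damps
    the oscillations of plain gradient descent."""
    for key in params:                     # e.g. "W1", "b1", ...
        v[key] = beta * v[key] + (1 - beta) * grads[key]
        params[key] = params[key] - learning_rate * v[key]
    return params, v
```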
RMSprop
The goal is the same as momentum above: make each iteration move toward the optimum as directly as possible instead of oscillating. The update rule is sketched below. The benefits RMSprop brings are faster convergence and the ability to use a larger learning rate.
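A minimal sketch of one RMSprop update, under the same dictionary convention as the momentum sketch above; the beta2 and epsilon defaults are just typical values:

```python
import numpy as np

def rmsprop_step(params, grads, s, beta2=0.999, learning_rate=0.01, epsilon=1e-8):
    """One RMSprop update: divide each gradient by the root of an
    exponentially weighted average of its squares, so directions with
    large, oscillating gradients get smaller effective steps."""
    for key in params:
        s[key] = beta2 * s[key] + (1 - beta2) * grads[key] ** 2
        params[key] = params[key] - learning_rate * grads[key] / (np.sqrt(s[key]) + epsilon)
    return params, s
```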
Adam optimization algorithm:
It combines the momentum and RMSprop algorithms. Adam stands for Adaptive Moment Estimation.
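A minimal sketch of one Adam update, again assuming dictionaries keyed by parameter name; t is the 1-based step counter used for bias correction:

```python
import numpy as np

def adam_step(params, grads, v, s, t, learning_rate=0.001,
              beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam update: a momentum term v plus an RMSprop term s,
    both bias-corrected with the step counter t (t starts at 1)."""
    for key in params:
        v[key] = beta1 * v[key] + (1 - beta1) * grads[key]
        s[key] = beta2 * s[key] + (1 - beta2) * grads[key] ** 2
        v_hat = v[key] / (1 - beta1 ** t)
        s_hat = s[key] / (1 - beta2 ** t)
        params[key] = params[key] - learning_rate * v_hat / (np.sqrt(s_hat) + epsilon)
    return params, v, s
```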
Learning rate decay
Why? To reduce oscillation near the optimum: as training converges, a smaller learning rate lets the parameters settle into a tighter region instead of bouncing around the minimum.
How can it be implemented? A few common schedules are sketched below.
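Some commonly used decay schedules, written as short sketches; the alpha0 and decay_rate constants are illustrative, not prescribed values:

```python
import numpy as np

alpha0 = 0.2        # initial learning rate (illustrative value)
decay_rate = 1.0

def inverse_decay(epoch_num):
    # alpha = alpha0 / (1 + decay_rate * epoch_num)
    return alpha0 / (1 + decay_rate * epoch_num)

def exponential_decay(epoch_num, k=0.95):
    # alpha = k**epoch_num * alpha0
    return (k ** epoch_num) * alpha0

def sqrt_decay(epoch_num, k=1.0):
    # alpha = k / sqrt(epoch_num) * alpha0, for epoch_num >= 1
    return k / np.sqrt(epoch_num) * alpha0
```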
Local optima and saddle points
In large neural networks, saddle points are likely to be much more common than local optima.
Ref:
Coursera, Deep Learning, Andrew Ng