Coursera Deep Learning 2 Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization - week2, Optimization algorithms
Gradient descent
Batch gradient descent, mini-batch gradient descent, stochastic gradient descent
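A minimal NumPy sketch of these three variants, assuming a hypothetical forward_backward function that returns the cost and a gradient dictionary keyed "dW1", "db1", and so on (these names are placeholders, not course code). Setting batch_size to the full training-set size m gives batch gradient descent, batch_size=1 gives stochastic gradient descent, and anything in between is mini-batch gradient descent.

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the (n_x, m) data and split it into mini-batches along axis 1."""
    np.random.seed(seed)
    m = X.shape[1]
    perm = np.random.permutation(m)
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    return [(X_shuf[:, k:k + batch_size], Y_shuf[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

def gradient_descent_epoch(params, X, Y, forward_backward,
                           learning_rate=0.01, batch_size=64):
    """One pass over the data with plain gradient descent updates."""
    for X_batch, Y_batch in random_mini_batches(X, Y, batch_size):
        cost, grads = forward_backward(params, X_batch, Y_batch)
        for key in params:                        # params like {"W1": ..., "b1": ...}
            params[key] -= learning_rate * grads["d" + key]
    return params
```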



There are many optimization algorithms that work better than plain gradient descent. Before looking at them, we first need to understand the concept of exponentially weighted averages.



The exponentially weighted average is a way of computing an average that is very economical in storage and memory, but not very accurate in the early steps. This motivates the concept of bias correction, which makes the exponentially weighted average more accurate.
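A minimal sketch of the exponentially weighted average v_t = beta * v_{t-1} + (1 - beta) * theta_t with bias correction v_t / (1 - beta^t); only the running value v is stored, which is why it is so cheap in memory. The temperature series at the end is just an illustrative example.

```python
import numpy as np

def ewa_with_bias_correction(data, beta=0.9):
    """Exponentially weighted average of a 1-D sequence, with bias correction."""
    v = 0.0
    corrected = []
    for t, theta in enumerate(data, start=1):
        v = beta * v + (1 - beta) * theta          # only v is kept -> O(1) memory
        corrected.append(v / (1 - beta ** t))      # bias correction for early steps
    return np.array(corrected)

# Example: smooth a noisy series; beta=0.9 averages roughly the
# last 1 / (1 - beta) = 10 points.
temps = 20 + np.random.randn(100)
smoothed = ewa_with_bias_correction(temps, beta=0.9)
```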

Momentum (also called gradient descent with momentum)
Plain gradient descent has the problem shown in the figure below: each iteration oscillates back and forth instead of heading straight toward the optimum, which is especially noticeable when feature scaling has not been applied. This motivates a modified algorithm, gradient descent with momentum.
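A minimal sketch of the momentum update for a single parameter W: keep an exponentially weighted average v of the gradients and step along that smoothed direction. dW here is the current gradient, a placeholder supplied by backpropagation.

```python
import numpy as np

def update_with_momentum(W, dW, v_dW, learning_rate=0.01, beta=0.9):
    """v = beta * v + (1 - beta) * dW;  W = W - alpha * v."""
    v_dW = beta * v_dW + (1 - beta) * dW   # exponentially weighted average of gradients
    W = W - learning_rate * v_dW           # step along the smoothed direction
    return W, v_dW
```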


RMSprop
The goal is the same as momentum above: make each iteration point toward the optimum as much as possible instead of oscillating back and forth. The algorithm is implemented as follows. The benefits RMSprop brings are faster convergence and the ability to use a larger learning rate.
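A minimal sketch of the RMSprop update for a single parameter: keep an exponentially weighted average s of the squared gradients and divide the update by its square root, which damps the directions that oscillate the most. The epsilon term is only there for numerical stability.

```python
import numpy as np

def update_with_rmsprop(W, dW, s_dW, learning_rate=0.01, beta2=0.999, eps=1e-8):
    """s = beta2 * s + (1 - beta2) * dW**2;  W = W - alpha * dW / (sqrt(s) + eps)."""
    s_dW = beta2 * s_dW + (1 - beta2) * np.square(dW)    # average of squared gradients
    W = W - learning_rate * dW / (np.sqrt(s_dW) + eps)   # damp oscillating directions
    return W, s_dW
```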

Adam optimization algorithm
It combines the momentum and RMSprop algorithms. Adam stands for adaptive moment estimation.
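A minimal sketch of one Adam step for a single parameter, combining the momentum term (v, the first moment) and the RMSprop term (s, the second moment), both with bias correction; t is the iteration counter starting from 1.

```python
import numpy as np

def adam_step(W, dW, v, s, t, learning_rate=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameter W given its gradient dW."""
    v = beta1 * v + (1 - beta1) * dW                  # momentum: first moment
    s = beta2 * s + (1 - beta2) * np.square(dW)       # RMSprop: second moment
    v_hat = v / (1 - beta1 ** t)                      # bias correction
    s_hat = s / (1 - beta2 ** t)
    W = W - learning_rate * v_hat / (np.sqrt(s_hat) + eps)
    return W, v, s
```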


Learning rate decay
Why? To reduce the oscillation around the optimum in the later stages of training.

What are some ways to implement it?
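A few common decay schedules from the lecture, sketched as Python functions; alpha0 is the initial learning rate and epoch counts full passes over the training set. Beyond these, one can also use a discrete "staircase" schedule or simply decay the rate manually.

```python
import numpy as np

def inverse_decay(alpha0, epoch, decay_rate=1.0):
    """alpha = alpha0 / (1 + decay_rate * epoch)"""
    return alpha0 / (1 + decay_rate * epoch)

def exponential_decay(alpha0, epoch, base=0.95):
    """alpha = base**epoch * alpha0"""
    return (base ** epoch) * alpha0

def sqrt_decay(alpha0, epoch, k=1.0):
    """alpha = k / sqrt(epoch) * alpha0, for epoch >= 1"""
    return k / np.sqrt(epoch) * alpha0
```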


Local optima and saddle points
In large neural networks, saddle points are likely more common than local optima.


Ref:
Coursera, Deep Learning, Andrew Ng