why deep learning works
[Intuitive explanation] What is regularization?
李宏毅 (Hung-yi Lee) / Understand Deep Learning in One Day (一天搞懂深度學習)
gradient descent
http://www.deeplearningbook.org/contents/numerical.html
http://cs231n.github.io/neural-networks-3/
https://arxiv.org/pdf/1609.04747.pdf
http://www.deeplearningbook.org/contents/optimization.html
https://www.quora.com/Is-a-single-layered-ReLu-network-still-a-universal-approximator/answer/Conner-Davis-2
https://www.analyticsvidhya.com/blog/2017/04/comparison-between-deep-learning-machine-learning/
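To make the gradient-descent references above concrete, here is a minimal NumPy sketch of plain batch gradient descent on a least-squares problem; the data, learning rate, and step count are arbitrary illustrative choices, not taken from the linked material.

```python
import numpy as np

# Plain batch gradient descent on least squares: minimize mean((X @ w - y)**2).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(3)
lr = 0.05                                    # illustrative step size
for step in range(500):
    grad = 2.0 * X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error
    w -= lr * grad                           # move against the gradient
print(w)                                     # ends up close to true_w
```

The optimizers surveyed in the Ruder overview linked above (SGD, momentum, Adam and friends) refine exactly this update with mini-batches and adaptive step sizes.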
softmax
https://www.quora.com/What-is-the-intuition-behind-SoftMax-function/answer/Sebastian-Raschka-1
https://eli.thegreenplace.net/2016/the-softmax-function-and-its-derivative/
http://ufldl.stanford.edu/tutorial/supervised/SoftmaxRegression/
https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network-How-does-this-function-in-a-human-neural-network-system
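As a companion to the softmax links above, a minimal NumPy implementation; the max-subtraction is the standard numerical-stability trick, and the example logits are arbitrary.

```python
import numpy as np

def softmax(logits):
    # Subtracting the max leaves the result unchanged but prevents overflow in exp.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

print(softmax(np.array([2.0, 1.0, 0.1])))   # ~[0.66, 0.24, 0.10], sums to 1
```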
Important
http://www.cs.toronto.edu/~fleet/courses/cifarSchool09/slidesBengio.pdf
https://www.iro.umontreal.ca/~bengioy/papers/ftml_book.pdf
https://medium.com/@vivek.yadav/how-neural-networks-learn-nonlinear-functions-and-classify-linearly-non-separable-data-22328e7e5be1
How can convolution be explained intuitively?
https://www.zhihu.com/question/22298352?rf=21686447
An intuitive explanation of how convolutional neural networks work?
https://www.zhihu.com/question/39022858
https://mlnotebook.github.io/post/
https://zhuanlan.zhihu.com/p/28478034
http://timdettmers.com/2015/03/26/convolution-deep-learning/
https://www.quora.com/Is-ReLU-a-linear-piece-wise-linear-or-non-linear-activation-function
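A minimal NumPy sketch of the 2-D convolution the articles above explain. Note that deep-learning frameworks actually compute cross-correlation (no kernel flip), which is what this does; the Sobel-like kernel and the 8x8 random image are only illustrative.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation, the operation frameworks call convolution."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

sobel_x = np.array([[1., 0., -1.],
                    [2., 0., -2.],
                    [1., 0., -1.]])          # responds to vertical edges
image = np.random.rand(8, 8)
print(conv2d(image, sobel_x).shape)          # (6, 6): one response per 3x3 window
```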
=========
transfer learning
https://www.quora.com/Why-is-deep-learning-so-easy
===============
What is a simple explanation of how artificial neural networks work?
How can I learn Deep Learning quickly?
https://www.quora.com/How-can-I-learn-Deep-Learning-quickly
https://www.quora.com/Why-do-neural-networks-need-more-than-one-hidden-layer
Yoshua Bengio
https://www.iro.umontreal.ca/~bengioy/papers/ftml_book.pdf
https://www.iro.umontreal.ca/~lisa/pointeurs/TR1312.pdf
http://videolectures.net/deeplearning2015_bengio_theoretical_motivations/
http://www.cs.toronto.edu/~fleet/courses/cifarSchool09/slidesBengio.pdf
Universal Approximation Theorem
https://pdfs.semanticscholar.org/f22f/6972e66bdd2e769fa64b0df0a13063c0c101.pdf
http://www.cs.cmu.edu/~epxing/Class/10715/reading/Kornick_et_al.pdf
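A small numerical illustration of the theorem in one dimension, only a sketch: a single hidden layer of ReLU units can reproduce any piecewise-linear interpolant, so adding units drives the error against a smooth target toward zero. The target (sin), the interval, and the number of knots are arbitrary choices.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# The piecewise-linear interpolant of sin on [-pi, pi], written as a
# one-hidden-layer ReLU network: output = bias + sum_i c_i * relu(x - k_i).
knots = np.linspace(-np.pi, np.pi, 20)
vals = np.sin(knots)
slopes = np.diff(vals) / np.diff(knots)
coeffs = np.diff(slopes, prepend=0.0)        # initial slope, then the slope change at each knot

def one_hidden_layer(x):
    hidden = relu(x[:, None] - knots[None, :-1])   # one ReLU unit per interval
    return vals[0] + hidden @ coeffs

x = np.linspace(-np.pi, np.pi, 1000)
print(np.max(np.abs(one_hidden_layer(x) - np.sin(x))))   # shrinks as more knots are used
```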
"Deep Learning" book-reading series, Chapter 4: Numerical Computation (talk summary)
Nonlinear Classifiers
https://www.quora.com/Why-do-neural-networks-need-an-activation-function
http://ai.stanford.edu/~quocle/tutorial1.pdf
http://cs231n.github.io/neural-networks-1/
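A two-line check of the point behind the "why do networks need an activation function" link above: without a nonlinearity, stacked layers collapse into a single linear map. The matrix sizes here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
W1 = rng.normal(size=(4, 3))                 # "layer 1" weights
W2 = rng.normal(size=(2, 4))                 # "layer 2" weights
x = rng.normal(size=3)

# Two linear layers are exactly one linear layer:
assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)

# An elementwise nonlinearity between them breaks the collapse,
# which is what lets depth add expressive power.
relu = lambda z: np.maximum(0.0, z)
print(W2 @ relu(W1 @ x))
```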
NN, CNN
https://www.analyticsvidhya.com/blog/2017/04/comparison-between-deep-learning-machine-learning/
[CV] An intuitive understanding of "convolution": from the Fourier transform to filters
https://zhuanlan.zhihu.com/p/28478034
http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf
https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/
https://mlnotebook.github.io/post/CNN1/
http://bamos.github.io/2016/08/09/deep-completion/
https://www.analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/
Applied Deep Learning - Part 1: Artificial Neural Networks
Papers
Dropout (Hinton et al.)
https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
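A minimal sketch of "inverted" dropout, the variant most libraries implement; the JMLR paper above instead scales the weights at test time, and the drop probability used here is just the common default.

```python
import numpy as np

def dropout(activations, p_drop=0.5, train=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop during training
    and rescale the survivors, so nothing has to change at test time."""
    if not train or p_drop == 0.0:
        return activations
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

h = np.random.randn(4, 8)                    # a batch of hidden activations
print(dropout(h, p_drop=0.5))                # roughly half the entries zeroed, the rest doubled
```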
Neural Network with Unbounded Activation Functions is Universal Approximator
https://arxiv.org/pdf/1505.03654.pdf
Transfer Learning
Paper by Yoshua Bengio (another deep learning pioneer).
Paper by Ali Sharif Razavian.
Paper by Jeff Donahue.
Paper and subsequent paper by Dario Garcia-Gasulla.
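A hedged sketch of the feature-extraction recipe these papers study, written with tf.keras and an ImageNet-pretrained ResNet50. It assumes TensorFlow 2.x is installed and is not code from any of the listed papers; the 10-class head, optimizer, and loss are placeholders.

```python
import tensorflow as tf

# Freeze a pretrained backbone and train only a small new head on the target task.
base = tf.keras.applications.ResNet50(weights="imagenet",
                                      include_top=False, pooling="avg")
base.trainable = False                                  # keep the pretrained features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(10, activation="softmax"),    # placeholder 10-class head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)       # expects 224x224x3 inputs
```

Fine-tuning is the closely related recipe: the same setup with part of the backbone unfrozen and a small learning rate.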
overfitting
https://medium.com/towards-data-science/deep-learning-overfitting-846bf5b35e24
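A self-contained illustration of overfitting, not taken from the article above: a degree-9 polynomial can interpolate ten noisy samples of a sine wave exactly, yet it usually generalizes worse than a degree-3 fit. The degrees, noise level, and random seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.2 * rng.normal(size=10)   # 10 noisy samples
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)                                 # the clean target

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)                   # least-squares polynomial fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
# The degree-9 fit drives training error to ~0 but typically has the larger test error,
# which is what regularization, dropout, and early stopping are meant to control.
```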
Courses from top universities
CS231n
http://www.jianshu.com/p/182baeb82c71
https://www.coursera.org/learn/neural-networks
Paid video courses
https://www.udemy.com/deeplearning/?siteID=mDjthAvMbf0-ZE2EvHFczLauDLzv0OQAKg&LSNPUBID=mDjthAvMbf0
Papers
The Power of Depth for Feedforward Neural Networks
https://arxiv.org/pdf/1512.03965.pdf?platform=hootsuite
Deep Residual Learning for Image Recognition
https://arxiv.org/pdf/1512.03385v1.pdf
Speed/accuracy trade-offs for modern convolutional object detectors
https://arxiv.org/pdf/1611.10012.pdf
Playing Atari with Deep Reinforcement Learning
https://arxiv.org/pdf/1312.5602v1.pdf
Transfer learning
https://databricks.com/blog/2017/06/06/databricks-vision-simplify-large-scale-deep-learning.html
TensorFlow Object Detection API
https://github.com/tensorflow/models/tree/477ed41e7e4e8a8443bc633846eb01e2182dc68a/object_detection
https://opensource.googleblog.com/2017/06/supercharge-your-computer-vision-models.html
Supercharge your Computer Vision models with the TensorFlow Object Detection API
https://research.googleblog.com/2017/06/supercharge-your-computer-vision-models.html
How to build a video object-recognition system with the TensorFlow API
https://www.jiqizhixin.com/articles/2017-07-14-5
How well does Google's open-sourced TensorFlow Object Detection API perform, and what impact will it have on the industry?
https://www.zhihu.com/question/61173908
https://stackoverflow.com/questions/42364513/how-to-recognise-multiple-objects-in-the-same-image
Training on your own dataset with the TensorFlow Object Detection API
https://zhuanlan.zhihu.com/p/27469690
https://github.com/tensorflow/models/tree/master/research/object_detection/data
https://stackoverflow.com/questions/44973184/train-tensorflow-object-detection-on-own-dataset
Brain science
https://www.quora.com/What-are-the-parts-of-the-neuron-and-their-function