Theories of Deep Learning
https://stats385.github.io/readings
Lecture 1 – Deep Learning Challenge. Is There Theory?
Readings
- Deep Deep Trouble
- Why 2016 is The Global Tipping Point...
- Are AI and ML Killing Analytics...
- The Dark Secret at The Heart of AI
- AI Robots Learning Racism...
- FaceApp Forced to Pull ‘Racist’ Filters...
- Losing a Whole Generation of Young Men to Video Games
Lecture 2 – Overview of Deep Learning From a Practical Point of View
Readings
- Emergence of simple-cell receptive field properties by learning a sparse code for natural images
- ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)
- Very Deep Convolutional Networks for Large-Scale Image Recognition (VGG)
- Going Deeper with Convolutions (GoogLeNet)
- Deep Residual Learning for Image Recognition (ResNet)
- Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (a sketch combining batch normalization with a ResNet-style residual block follows this list)
- Visualizing and Understanding Convolutional Networks
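The ResNet and batch-normalization papers above describe concrete architectural building blocks. Below is a minimal NumPy sketch of a single fully-connected residual block with training-mode batch normalization; the layer widths, ReLU placement, and identity shortcut are illustrative assumptions, not the papers' exact configurations.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def residual_block(x, W1, W2, gamma1, beta1, gamma2, beta2):
    # y = ReLU(x + F(x)), where F is two linear layers with batch norm and ReLU.
    h = np.maximum(batch_norm(x @ W1, gamma1, beta1), 0.0)   # first layer + ReLU
    h = batch_norm(h @ W2, gamma2, beta2)                    # second layer
    return np.maximum(x + h, 0.0)                            # identity shortcut, then ReLU

# Toy usage: a batch of 8 vectors with 16 features.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
W1 = rng.standard_normal((16, 16)) * 0.1
W2 = rng.standard_normal((16, 16)) * 0.1
ones, zeros = np.ones(16), np.zeros(16)
y = residual_block(x, W1, W2, ones, zeros, ones, zeros)
print(y.shape)  # (8, 16)
```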
Blogs
Videos
Lecture 3
Readings
- A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction (a toy cascade of this type is sketched after this list)
- Energy Propagation in Deep Convolutional Neural Networks
- Discrete Deep Feature Extraction: A Theory and New Architectures
- Topology Reduction in Deep Convolutional Feature Extraction Networks
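These papers analyze scattering-type feature extractors built from convolutions, a pointwise modulus nonlinearity, and pooling. Below is a minimal one-dimensional NumPy sketch of such a cascade; the random filters, two-layer depth, and plain average pooling are assumptions for illustration, not the structured filter banks the theory actually covers.

```python
import numpy as np

def layer(signals, filters, pool=2):
    # One scattering-type layer: convolve each signal with each filter,
    # take the pointwise modulus, then average-pool by a factor of `pool`.
    out = []
    for s in signals:
        for f in filters:
            u = np.abs(np.convolve(s, f, mode="same"))
            out.append(u.reshape(-1, pool).mean(axis=1))
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(64)                 # input signal
bank1 = [rng.standard_normal(5) for _ in range(3)]
bank2 = [rng.standard_normal(5) for _ in range(2)]

features1 = layer([x], bank1, pool=2)       # 3 feature signals of length 32
features2 = layer(features1, bank2, pool=2) # 6 feature signals of length 16
feature_vector = np.concatenate(features1 + features2)
print(feature_vector.shape)                 # (192,)
```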
Lecture 4
Readings
- A Probabilistic Framework for Deep Learning
- Semi-Supervised Learning with the Deep Rendering Mixture Model
- A Probabilistic Theory of Deep Learning
Lecture 5
Readings
- Why and When Can Deep-but Not Shallow-Networks Avoid the Curse of Dimensionality: A Review (the shallow-vs-deep unit counts are illustrated numerically after this list)
- Learning Functions: When is Deep Better Than Shallow
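Both papers contrast how many units a shallow network versus a hierarchical (deep) network needs to approximate a compositional function to accuracy eps. The snippet below only plugs numbers into the rates quoted in the review for a binary-tree compositional function of d variables with constituent smoothness m; the precise constants and assumptions are in the papers.

```python
# Order-of-magnitude comparison of the approximation rates quoted in the review:
# shallow networks need O(eps**(-d/m)) units, while a deep network matching the
# binary-tree structure needs O((d - 1) * eps**(-2/m)) units.
d, m, eps = 8, 2, 0.1

shallow_units = eps ** (-d / m)          # ~ 10_000
deep_units = (d - 1) * eps ** (-2 / m)   # ~ 70
print(f"shallow ~ {shallow_units:.0f} units, deep ~ {deep_units:.0f} units")
```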
Lecture 6
Readings
- Convolutional Patch Representations for Image Retrieval: an Unsupervised Approach
- Convolutional Kernel Networks
- Kernel Descriptors for Visual Recognition
- End-to-End Kernel Learning with Supervised Convolutional Kernel Networks
- Learning with Kernels (a minimal kernel ridge regression sketch follows this list)
- Kernel Based Methods for Hypothesis Testing
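The Lecture 6 readings build image representations out of kernels. As a baseline reference point, here is a minimal NumPy sketch of kernel ridge regression with a Gaussian kernel; it illustrates the generic kernel machinery only, not the convolutional kernel networks of Mairal et al., and the toy data, bandwidth, and regularization constant are arbitrary choices.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * sigma^2)).
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))                  # training inputs
y = np.sin(X.sum(axis=1))                         # toy regression target
lam = 1e-2                                        # ridge regularization

K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # kernel ridge regression

X_test = rng.standard_normal((10, 5))
y_pred = gaussian_kernel(X_test, X) @ alpha
print(y_pred.shape)  # (10,)
```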
Lecture 7
Readings
- Geometry of Neural Network Loss Surfaces via Random Matrix Theory
- Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice
- Nonlinear random matrix theory for deep learning (a Marchenko-Pastur baseline is sketched after this list)
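A recurring object in these papers is the spectrum of large random matrices. The sketch below computes the empirical eigenvalue spectrum of a sample covariance matrix of i.i.d. Gaussian data and compares its support to the Marchenko-Pastur edges; the nonlinear corrections studied by Pennington and collaborators are not included, and the matrix sizes are arbitrary.

```python
import numpy as np

# Empirical spectrum of (1/n) X^T X for an n x p Gaussian matrix. With aspect
# ratio c = p/n, the Marchenko-Pastur law predicts eigenvalues supported on
# [(1 - sqrt(c))^2, (1 + sqrt(c))^2].
rng = np.random.default_rng(0)
n, p = 4000, 1000
X = rng.standard_normal((n, p))
eigs = np.linalg.eigvalsh(X.T @ X / n)

c = p / n
print(f"empirical range: [{eigs.min():.3f}, {eigs.max():.3f}]")
print(f"Marchenko-Pastur edges: [{(1 - np.sqrt(c))**2:.3f}, {(1 + np.sqrt(c))**2:.3f}]")
```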
Lecture 8
Readings
- Deep Learning without Poor Local Minima
- Topology and Geometry of Half-Rectified Network Optimization
- Convexified Convolutional Neural Networks
- Implicit Regularization in Matrix Factorization (a toy version of this experiment is sketched after this list)
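Gunasekar et al. study gradient descent on a factorized matrix objective from small initialization and argue it is biased toward low nuclear norm solutions. The NumPy sketch below reproduces only the experimental setup in miniature (random rank-2 target, roughly half the entries observed, small initialization); the matrix sizes, step size, and iteration count are arbitrary, and the script simply prints the resulting fit and nuclear norms.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 20, 2
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))   # rank-2 ground truth
mask = rng.random((d, d)) < 0.5                                  # observe ~half the entries

# Gradient descent on ||mask * (U V^T - M)||_F^2 from a small initialization.
U = 1e-3 * rng.standard_normal((d, d))
V = 1e-3 * rng.standard_normal((d, d))
lr = 0.01
for _ in range(5000):
    R = mask * (U @ V.T - M)                         # residual on observed entries
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)      # simultaneous gradient step

X = U @ V.T
print("max error on observed entries:", np.abs(mask * (X - M)).max())
print("nuclear norm of solution:", np.linalg.svd(X, compute_uv=False).sum())
print("nuclear norm of ground truth:", np.linalg.svd(M, compute_uv=False).sum())
```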
Lecture 9
Readings
- Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position
- Perception as an inference problem
- A Neurobiological Model of Visual Attention and Invariant Pattern Recognition Based on Dynamic Routing of Information
Lecture 10
Readings
- Working Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding (a minimal sparse pursuit sketch follows this list)
- Convolutional Neural Networks Analyzed via Convolutional Sparse Coding
- Multi-Layer Convolutional Sparse Modeling: Pursuit and Dictionary Learning
- Convolutional Dictionary Learning via Local Processing
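The Lecture 10 papers analyze CNNs through convolutional sparse coding, where the forward pass is interpreted as a sparse pursuit. Below is a minimal NumPy sketch of ISTA (iterative soft-thresholding) for a lasso-type pursuit with a generic dense dictionary; the convolutional (banded) dictionary structure and the multi-layer model of the papers are omitted, and the dictionary, sparsity level, and regularization weight are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(x, D, lam, n_iter=200):
    # Iterative soft-thresholding for min_z 0.5*||x - D z||^2 + lam*||z||_1.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = soft_threshold(z + D.T @ (x - D @ z) / L, lam / L)
    return z

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
z_true = np.zeros(128)
z_true[rng.choice(128, size=5, replace=False)] = rng.standard_normal(5)
x = D @ z_true + 0.01 * rng.standard_normal(64)

z_hat = ista(x, D, lam=0.05)
print("coefficients above 1e-3:", np.count_nonzero(np.abs(z_hat) > 1e-3))
```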
To be discussed and extra
- Emergence of simple-cell receptive field properties by learning a sparse code for natural images by Olshausen and Field
- Auto-Encoding Variational Bayes by Kingma and Welling
- Generative Adversarial Networks by Goodfellow et al.
- Understanding Deep Learning Requires Rethinking Generalization by Zhang et al.
- Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy? by Giryes et al. (the distance preservation of a random linear layer is sketched at the end of this list)
- Robust Large Margin Deep Neural Networks by Sokolic et al.
- Tradeoffs between Convergence Speed and Reconstruction Accuracy in Inverse Problems by Giryes et al.
- Understanding Trainable Sparse Coding via Matrix Factorization by Moreau and Bruna
- Why are Deep Nets Reversible: A Simple Theory, With Implications for Training by Arora et al.
- Stable Recovery of the Factors From a Deep Matrix Product and Application to Convolutional Network by Malgouyres and Landsberg
- Optimal Approximation with Sparse Deep Neural Networks by Bolcskei et al.
- Convolutional Rectifier Networks as Generalized Tensor Decompositions by Cohen and Shashua
- Emergence of Invariance and Disentanglement in Deep Representations by Achille and Soatto
- Deep Learning and the Information Bottleneck Principle by Tishby and Zaslavsky
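Several of the extra readings (for example Giryes et al. on random Gaussian weights) ask what random layers do to the geometry of the data. The sketch below shows only the linear ingredient: a wide random Gaussian layer with 1/sqrt(m) scaling approximately preserves pairwise distances. The effect of the pointwise nonlinearity, which is the substance of those papers, is not modeled, and all sizes here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 100, 32, 2048
X = rng.standard_normal((n, d))              # n points in d dimensions
W = rng.standard_normal((d, m)) / np.sqrt(m) # random Gaussian layer, 1/sqrt(m) scaling
Y = X @ W                                    # linear random embedding

def pdist(A):
    # Matrix of pairwise Euclidean distances.
    return np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)

idx = np.triu_indices(n, 1)
ratio = pdist(Y)[idx] / pdist(X)[idx]
print(f"distance ratios: mean {ratio.mean():.3f}, std {ratio.std():.3f}")  # mean ~ 1
```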