http://neuralnetworksanddeeplearning.com/chap1.html

Up to now, we've been discussing neural networks where the output from one layer is used as input to the next layer. Such networks are called feedforward neural networks. This means there are no loops in the network - information is always fed forward, never fed back. If we did have loops, we'd end up with situations where the input to the σ function depended on the output. That'd be hard to make sense of, and so we don't allow such loops.
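To make the "layer feeds the next layer" idea concrete, here is a minimal sketch of a feedforward pass, assuming sigmoid neurons and a small made-up network (the layer sizes, weights, and biases below are invented for illustration, not taken from the chapter). Each layer's activation depends only on the layer before it, so the input to σ never depends on its own output.

```python
import numpy as np

def sigmoid(z):
    """The sigmoid function sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, weights, biases):
    """Propagate input x through the network one layer at a time.

    Each layer computes a' = sigma(w a + b) from the previous layer's
    activation a -- information only moves forward, there are no loops.
    """
    a = x
    for w, b in zip(weights, biases):
        a = sigmoid(np.dot(w, a) + b)
    return a

# Hypothetical 3-layer network: 4 inputs, 3 hidden neurons, 2 outputs.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 4)), rng.standard_normal((2, 3))]
biases = [rng.standard_normal((3, 1)), rng.standard_normal((2, 1))]
x = rng.standard_normal((4, 1))
print(feedforward(x, weights, biases))
```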

However, there are other models of artificial neural networks in which feedback loops are possible. These models are called recurrent neural networks. The idea in these models is to have neurons which fire for some limited duration of time, before becoming quiescent. That firing can stimulate other neurons, which may fire a little while later, also for a limited duration. That causes still more neurons to fire, and so over time we get a cascade of neurons firing. Loops don't cause problems in such a model, since a neuron's output only affects its input at some later time, not instantaneously.
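The timing argument can be seen in a toy sketch (not code from the chapter) of a single recurrent unit updated over discrete time steps. The weights w_in and w_rec, the bias, and the input sequence are all hypothetical; the point is only that the neuron's output at step t feeds back into its input at step t+1, never within the same step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run_recurrent_unit(inputs, w_in=0.8, w_rec=0.5, b=-0.2):
    """Update one recurrent neuron over discrete time steps.

    The activation from the previous step is fed back as part of the
    current input, so the loop never makes sigma's input depend on its
    own output instantaneously.
    """
    a = 0.0                    # start quiescent
    activations = []
    for x in inputs:
        # Uses the *previous* activation a, then overwrites it.
        a = sigmoid(w_in * x + w_rec * a + b)
        activations.append(a)
    return activations

# A brief stimulus followed by silence: the activity decays over time.
print(run_recurrent_unit([1.0, 1.0, 0.0, 0.0, 0.0]))
```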

Recurrent neural nets have been less influential than feedforward networks, in part because the learning algorithms for recurrent nets are (at least to date) less powerful. But recurrent networks are still extremely interesting. They're much closer in spirit to how our brains work than feedforward networks. And it's possible that recurrent networks can solve important problems which can only be solved with great difficulty by feedforward networks. However, to limit our scope, in this book we're going to concentrate on the more widely-used feedforward networks.


