A simple implementation of a recurrent neural network (forward propagation with NumPy; the original imported TensorFlow but only uses NumPy calls):

import numpy as np

x = [1, 2]                                           # input sequence
state = [0.0, 0.0]                                   # initial hidden state
w_cell_state = np.array([[0.1, 0.2], [0.3, 0.4]])    # state-to-state weights
w_cell_input = np.array([0.5, 0.6])                  # input-to-state weights
b_cell = np.array([0.1, -0.1])                       # cell bias
w_output = np.array([1.0, 2.0])                      # state-to-output weights
b_output = 0.1                                       # output bias

for i in range(len(x)):
    # loop body completed from the variables above (the source was truncated here):
    # hidden-state update tanh(state·W_ss + x_t·W_xs + b), then a fully connected output layer
    before_activation = np.dot(state, w_cell_state) + x[i] * w_cell_input + b_cell
    state = np.tanh(before_activation)
    final_output = np.dot(state, w_output) + b_output
    print("state:", state)
    print("output:", final_output)
Dropout works well in feed-forward neural networks, but it cannot be applied directly to an RNN: dropout on the recurrent connections amplifies noise across time steps and disrupts the network's own learning. The way to make it usable in an RNN is to apply it only to specific connections. An LSTM stores its long-term memory in the memory cell; the LSTM can decide to overwrite the memory cell, retrieve it, or keep it for the next time step. Main idea: apply dropout only to the non-recurrent connections, i.e., the connections between layers (see the sketch below).
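Below is a minimal NumPy sketch of this idea, not the paper's implementation: in a two-layer recurrent net, a dropout mask is applied only to the activations passed up from layer 1 to layer 2 (a non-recurrent connection), while the recurrent states carried across time steps are never dropped. All weights, shapes, and the keep_prob value are made up for illustration.

import numpy as np

def dropout(x, keep_prob, rng):
    # inverted dropout: zero out units with probability 1 - keep_prob and rescale
    mask = (rng.random(x.shape) < keep_prob) / keep_prob
    return x * mask

rng = np.random.default_rng(0)
x_seq = [np.array([1.0]), np.array([2.0])]   # toy input sequence
state1 = np.zeros(2)                          # recurrent state of layer 1
state2 = np.zeros(2)                          # recurrent state of layer 2
w1_s, w1_x = rng.random((2, 2)), rng.random((1, 2))   # layer-1 weights (illustrative)
w2_s, w2_x = rng.random((2, 2)), rng.random((2, 2))   # layer-2 weights (illustrative)

for x_t in x_seq:
    # recurrent connections: no dropout on state1 / state2 across time steps
    state1 = np.tanh(np.dot(state1, w1_s) + np.dot(x_t, w1_x))
    # non-recurrent (layer-to-layer) connection: dropout is applied only here
    h1 = dropout(state1, keep_prob=0.5, rng=rng)
    state2 = np.tanh(np.dot(state2, w2_s) + np.dot(h1, w2_x))
    print("layer-2 state:", state2)

In tf.keras this roughly corresponds to leaving recurrent_dropout at 0 on the LSTM layers and applying dropout only between stacked layers (e.g. a Dropout layer or the layer's dropout argument).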