simpleRNN

The training set is the English plain-text file of Alice in Wonderland. Goal: given a randomly chosen seed of 10 characters, generate the most likely next 100 characters.
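Generation works as a sliding window: the model predicts one character from the current 10-character window, the window is shifted left by one position, and the predicted character is appended. The following is only a conceptual sketch of that loop; predict_next_char is a hypothetical stand-in for the trained model and is not part of the original code.

# Conceptual sketch of sliding-window generation.
# predict_next_char is a placeholder for the trained model (hypothetical).
def generate(seed, n_chars, predict_next_char, seqlen=10):
    window = seed[-seqlen:]              # the model only ever sees SEQLEN characters
    out = seed
    for _ in range(n_chars):
        nxt = predict_next_char(window)  # one predicted character
        out += nxt
        window = window[1:] + nxt        # slide the window forward by one
    return out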

Generating the character vector space

In [4]: INPUT_FILE = "./data/alice_in_wonderland.txt"

In [5]: fin = open(INPUT_FILE, 'rb')
   ...: lines = []
   ...: for line in fin:
   ...:     line = line.strip().lower()
   ...:     line = line.decode("ascii", "ignore")
   ...:     if len(line) == 0:
   ...:         continue
   ...:     lines.append(line)
   ...: fin.close()
   ...: text = " ".join(lines)
   ...:

In [6]: lines[:20]
Out[6]:
['down, down, down. there was nothing else to do, so alice soon began',
'talking again. "dinah\'ll miss me very much to-night, i should think!"',
'(dinah was the cat.) "i hope they\'ll remember her saucer of milk at',
'tea-time. dinah, my dear, i wish you were down here with me! there are',
"no mice in the air, i'm afraid, but you might catch a bat, and that's",
'very like a mouse, you know. but do cats eat bats, i wonder?" and here',
'alice began to get rather sleepy, and went on saying to herself, in a',
'dreamy sort of way, "do cats eat bats? do cats eat bats?" and sometimes,',
'"do bats eat cats?" for, you see, as she couldn\'t answer either',
"question, it didn't much matter which way she put it. she felt that she",
'was dozing off, and had just begun to dream that she was walking hand in',
'hand with dinah, and saying to her very earnestly, "now, dinah, tell me',
'the truth: did you ever eat a bat?" when suddenly, thump! thump! down',
'she came upon a heap of sticks and dry leaves, and the fall was over.',
'alice was not a bit hurt, and she jumped up on to her feet in a moment:',
'she looked up, but it was all dark overhead; before her was another long',
'passage, and the white rabbit was still in sight, hurrying down it.',
'there was not a moment to be lost: away went alice like the wind, and',
'was just in time to hear it say, as it turned a corner, "oh my ears and',
'whiskers, how late it\'s getting!" she was close behind it when she']

In [7]: text[:10]
Out[7]: 'down, down'

In [8]: chars = set([c for c in text])
   ...: nb_chars = len(chars)
   ...: char2index = dict((c, i) for i, c in enumerate(chars))
   ...: index2char = dict((i, c) for i, c in enumerate(chars))
   ...:

In [9]: nb_chars
Out[9]: 57

In [11]: char2index
Out[11]:
{' ': 49,
'!': 40,
'"': 4,
'$': 52,
'%': 28,
'&': 30,
"'": 17,
'(': 5,
')': 12,
'*': 21,
',': 0,
'-': 13,
'.': 45,
'/': 50,
'0': 51,
'1': 2,
'2': 16,
'3': 15,
'4': 54,
'5': 25,
'6': 48,
'7': 35,
'8': 37,
'9': 32,
':': 39,
';': 10,
'?': 29,
'@': 53,
'[': 11,
']': 47,
'_': 20,
'a': 24,
'b': 26,
'c': 34,
'd': 38,
'e': 27,
'f': 44,
'g': 23,
'h': 41,
'i': 18,
'j': 8,
'k': 7,
'l': 56,
'm': 1,
'n': 22,
'o': 6,
'p': 3,
'q': 14,
'r': 36,
's': 33,
't': 31,
'u': 9,
'v': 42,
'w': 19,
'x': 46,
'y': 43,
'z': 55}

In [12]: len(text)
Out[12]: 159777

In [14]: SEQLEN = 10
    ...: STEP = 1
    ...:
    ...: input_chars = []
    ...: label_chars = []
    ...: for i in range(0, len(text) - SEQLEN, STEP):
    ...:     input_chars.append(text[i:i + SEQLEN])
    ...:     label_chars.append(text[i + SEQLEN])
    ...:

In [15]: input_chars[:10]
Out[15]:
['down, down',
'own, down,',
'wn, down, ',
'n, down, d',
', down, do',
' down, dow',
'down, down',
'own, down.',
'wn, down. ',
'n, down. t']

In [16]: label_chars[:10]
Out[16]: [',', ' ', 'd', 'o', 'w', 'n', '.', ' ', 't', 'h']

In [17]: len(text)
Out[17]: 159777

In [18]: len(input_chars)
Out[18]: 159767

In [19]: len(label_chars)
Out[19]: 159767

In [20]: t = np.zeros((10, 3, 3))

In [21]: t
Out[21]:
array([[[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]],

       [[0., 0., 0.],
        [0., 0., 0.],
        [0., 0., 0.]]])

In [22]: t = np.zeros((10, 3))

In [23]: t
Out[23]:
array([[0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.],
       [0., 0., 0.]])

In [24]: X = np.zeros((len(input_chars), SEQLEN, nb_chars), dtype=np.bool)
    ...: y = np.zeros((len(input_chars), nb_chars), dtype=np.bool)
    ...: for i, input_char in enumerate(input_chars):
    ...:     for j, ch in enumerate(input_char):
    ...:         X[i, j, char2index[ch]] = 1
    ...:     y[i, char2index[label_chars[i]]] = 1
    ...:

In [25]: X[0]
Out[25]:
array([[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, True, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, True, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, True, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, True, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[ True, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, True, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, True, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, True, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, True, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False],
[False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, True, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False, False, False, False, False, False, False,
False, False, False]])

In [26]: X[0].shape
Out[26]: (10, 57)

In [27]: input_chars[10]
Out[27]: ', down. th'
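Everything up to this point can be wrapped into a small helper so the same vectorization is easy to rerun. This is a sketch of my own (the name vectorize_windows is not from the book), assuming the input_chars, label_chars and char2index objects built in the session above; it reproduces the X and y tensors shown there.

import numpy as np

# Hypothetical helper reproducing the one-hot vectorization above:
# X has shape (num_windows, SEQLEN, nb_chars), y has shape (num_windows, nb_chars).
def vectorize_windows(input_chars, label_chars, char2index, seqlen):
    nb_chars = len(char2index)
    X = np.zeros((len(input_chars), seqlen, nb_chars), dtype=bool)
    y = np.zeros((len(input_chars), nb_chars), dtype=bool)
    for i, window in enumerate(input_chars):
        for j, ch in enumerate(window):
            X[i, j, char2index[ch]] = 1
        y[i, char2index[label_chars[i]]] = 1
    return X, y

# X, y = vectorize_windows(input_chars, label_chars, char2index, SEQLEN)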

Model training and prediction

(base) C:\Users\杨景\Desktop\keras深度学习实战\DeepLearningwithKeras_Code\Chapter06>ipython
Python 3.6.4 |Anaconda, Inc.| (default, Jan 16 2018, 10:22:32) [MSC v.1900 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 6.2.1 -- An enhanced Interactive Python. Type '?' for help.

In [1]: input_chars[:10]
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-1-15f893c11699> in <module>()
----> 1 input_chars[:10]

NameError: name 'input_chars' is not defined

In [2]: %run alice_chargen_rnn.py
F:\ana\lib\site-packages\h5py\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
Using TensorFlow backend.
Extracting text from input...
Creating input and label text...
Vectorizing input and label text...
==================================================
Iteration #: 0
Epoch 1/1
159767/159767 [==============================] - 29s 179us/step - loss: 2.3886
Generating from seed: d all the
d all the sor the she she she she she she she she she she she she she she she she she she she she she she she
==================================================
Iteration #: 1
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 2.0846
Generating from seed: t no restr
t no restre the wast on the sart in the sart in the sart in the sart in the sart in the sart in the sart in th
==================================================
Iteration #: 2
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 1.9825
Generating from seed: al damages
al damages an and the har her and the said the hat ere had alice and the dore to the dore to the dore to the d
==================================================
Iteration #: 3
Epoch 1/1
159767/159767 [==============================] - 26s 162us/step - loss: 1.8993
Generating from seed: rom being
rom being the mouse and the moute to the more tore to the more tore to the more tore to the more tore to the m
==================================================
Iteration #: 4
Epoch 1/1
159767/159767 [==============================] - 22s 136us/step - loss: 1.8309
Generating from seed: said alic
said alice, and she had fore the said to the king to ghe sore to the king to ghe sore to the king to ghe sore
==================================================
Iteration #: 5
Epoch 1/1
159767/159767 [==============================] - 22s 138us/step - loss: 1.7758
Generating from seed: l, if i mu
l, if i must the couster had her head with a little some of her head with a little some of her head with a lit
==================================================
Iteration #: 6
Epoch 1/1
159767/159767 [==============================] - 25s 156us/step - loss: 1.7290
Generating from seed: d up and r
d up and repponting to see to se project gutenberg-tm the sabe the could alice and the doon a little so the co
==================================================
Iteration #: 7
Epoch 1/1
159767/159767 [==============================] - 21s 129us/step - loss: 1.6894
Generating from seed: ows on it,
ows on it, and this a could not mest were not in a little she had see sous for and whin she had see sous for a
==================================================
Iteration #: 8
Epoch 1/1
159767/159767 [==============================] - 11s 67us/step - loss: 1.6551
Generating from seed: the botto
the botton with a little said to her find it the dormouse said the dormouse said the dormouse said the dormou
==================================================
Iteration #: 9
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.6249
Generating from seed: atures, wh
atures, what i sand the mork of the ont of the same the court and a little she said to herself a little she sa
==================================================
Iteration #: 10
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5991
Generating from seed: en leaves
en leaves in a little word that she was now she was she was she was she was she was she was she was she was sh
==================================================
Iteration #: 11
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5769
Generating from seed: oject gute
oject gutenberg-tm ate of the gryphon. "the king to her head the dormouse was a little and alice was not got t
==================================================
Iteration #: 12
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5563
Generating from seed: "that's v
"that's very such a plowers the rabbit her feet the rabbit her feet the rabbit her feet the rabbit her feet t
==================================================
Iteration #: 13
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5385
Generating from seed: ee the ear
ee the earing of the the great comation of the words of the toment of she the hatter. "i can't alice was not i
==================================================
Iteration #: 14
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.5226
Generating from seed: ng is, to
ng is, to the growing to the had hear hear hear hear hear hear hear hear hear hear hear hear hear hear hear he
==================================================
Iteration #: 15
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.5077
Generating from seed: " alice we
" alice were out it was a little she had net of the tome with a little she had net of the tome with a little s
==================================================
Iteration #: 16
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4954
Generating from seed: r in a lan
r in a lanter alice was not a sing to the mock turtle so the caterpillar a cance of the conter alice was not a
==================================================
Iteration #: 17
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4826
Generating from seed: nd - if yo
nd - if you dread a remesting and the project gutenberg-tm electronic works the project gutenberg-tm electroni
==================================================
Iteration #: 18
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4732
Generating from seed: in bringin
in bringing the looked down at the mock turtle so mech a little so me went on the looked down at the mock turt
==================================================
Iteration #: 19
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4613
Generating from seed: onour!" "d
onour!" "do you don't like that it was a rear the words to be a little she had been work the rabbit the conter
==================================================
Iteration #: 20
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4520
Generating from seed: other par
other parted to be so one of the same the dormouse she heard a comply and alice was a little she had the dorm
==================================================
Iteration #: 21
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4439
Generating from seed: n] "and ju
n] "and just as the hatter with the tome with the tome with the tome with the tome with the tome with the tome
==================================================
Iteration #: 22
Epoch 1/1
159767/159767 [==============================] - 10s 66us/step - loss: 1.4351
Generating from seed: med to be
med to be alice, "what so the dormouse said to herself in a little so the dormouse said to herself in a little
==================================================
Iteration #: 23
Epoch 1/1
159767/159767 [==============================] - 10s 64us/step - loss: 1.4285
Generating from seed: as it spo
as it spoke thing as she could be and the caterpillar of the court. "what said to herself an all this again t
==================================================
Iteration #: 24
Epoch 1/1
159767/159767 [==============================] - 10s 65us/step - loss: 1.4214
Generating from seed: her somet
her something the momert like the mouse of a from and she thought the moment she thought the moment she thoug
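Note that the script does not save the trained weights, which is why each new IPython session has to start over with %run (and why input_chars raised a NameError in the fresh session above). A possible addition, not part of the book's code, is to persist the model after the training loop; the file name below is illustrative.

# Hypothetical addition (not in the original script): save the trained model
# so a later session does not need to retrain.
model.save("alice_chargen_rnn.h5")

# ...and in a later session:
from keras.models import load_model
model = load_model("alice_chargen_rnn.h5")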

Model source code:

# -*- coding: utf-8 -*-
# Adapted from lstm_text_generation.py in keras/examples
from __future__ import print_function
from keras.layers.recurrent import SimpleRNN
from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np

INPUT_FILE = "./data/alice_in_wonderland.txt"

# extract the input as a stream of characters
print("Extracting text from input...")
fin = open(INPUT_FILE, 'rb')
lines = []
for line in fin:
    line = line.strip().lower()
    line = line.decode("ascii", "ignore")
    if len(line) == 0:
        continue
    lines.append(line)
fin.close()
text = " ".join(lines)

# creating lookup tables
# Here chars is the number of features in our character "vocabulary"
chars = set([c for c in text])
nb_chars = len(chars)
char2index = dict((c, i) for i, c in enumerate(chars))
index2char = dict((i, c) for i, c in enumerate(chars))

# create inputs and labels from the text. We do this by stepping
# through the text ${step} character at a time, and extracting a
# sequence of size ${seqlen} and the next output char. For example,
# assuming an input text "The sky was falling", we would get the
# following sequence of input_chars and label_chars (first 5 only)
#   The sky wa -> s
#   he sky was ->
#   e sky was  -> f
#    sky was f -> a
#   sky was fa -> l
print("Creating input and label text...")
SEQLEN = 10
STEP = 1

input_chars = []
label_chars = []
for i in range(0, len(text) - SEQLEN, STEP):
    input_chars.append(text[i:i + SEQLEN])
    label_chars.append(text[i + SEQLEN])

# vectorize the input and label chars
# Each row of the input is represented by seqlen characters, each
# represented as a 1-hot encoding of size len(char). There are
# len(input_chars) such rows, so shape(X) is (len(input_chars),
# seqlen, nb_chars).
# Each row of output is a single character, also represented as a
# dense encoding of size len(char). Hence shape(y) is (len(input_chars),
# nb_chars).
print("Vectorizing input and label text...")
X = np.zeros((len(input_chars), SEQLEN, nb_chars), dtype=np.bool)
y = np.zeros((len(input_chars), nb_chars), dtype=np.bool)
for i, input_char in enumerate(input_chars):
    for j, ch in enumerate(input_char):
        X[i, j, char2index[ch]] = 1
    y[i, char2index[label_chars[i]]] = 1

# Build the model. We use a single RNN with a fully connected layer
# to compute the most likely predicted output char
HIDDEN_SIZE = 128
BATCH_SIZE = 128
NUM_ITERATIONS = 25
NUM_EPOCHS_PER_ITERATION = 1
NUM_PREDS_PER_EPOCH = 100

model = Sequential()
model.add(SimpleRNN(HIDDEN_SIZE, return_sequences=False,
                    input_shape=(SEQLEN, nb_chars),
                    unroll=True))
model.add(Dense(nb_chars))
model.add(Activation("softmax"))

model.compile(loss="categorical_crossentropy", optimizer="rmsprop")

# We train the model in batches and test output generated at each step
for iteration in range(NUM_ITERATIONS):
    print("=" * 50)
    print("Iteration #: %d" % (iteration))
    model.fit(X, y, batch_size=BATCH_SIZE, epochs=NUM_EPOCHS_PER_ITERATION)

    # testing model
    # randomly choose a row from input_chars, then use it to
    # generate text from model for next 100 chars
    test_idx = np.random.randint(len(input_chars))
    test_chars = input_chars[test_idx]
    print("Generating from seed: %s" % (test_chars))
    print(test_chars, end="")
    for i in range(NUM_PREDS_PER_EPOCH):
        Xtest = np.zeros((1, SEQLEN, nb_chars))
        for i, ch in enumerate(test_chars):
            Xtest[0, i, char2index[ch]] = 1
        pred = model.predict(Xtest, verbose=0)[0]
        ypred = index2char[np.argmax(pred)]
        print(ypred, end="")
        # move forward with test_chars + ypred
        test_chars = test_chars[1:] + ypred
    print()
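As the training log above shows, the generated text quickly falls into loops ("in the sart in the sart in ..."), largely because the next character is always chosen with np.argmax. A common variation, not part of the book's script, is to sample the next index from the softmax output with a temperature; a minimal sketch, using only the pred vector already computed in the loop:

import numpy as np

# Hypothetical alternative to the argmax decoding above: sample the next
# character index from the predicted distribution, with a temperature that
# controls how greedy the sampling is (lower = closer to argmax).
def sample_index(preds, temperature=0.8):
    preds = np.asarray(preds, dtype=np.float64)
    preds = np.log(preds + 1e-8) / temperature       # rescale log-probabilities
    probs = np.exp(preds) / np.sum(np.exp(preds))    # renormalize to a distribution
    return np.random.choice(len(probs), p=probs)

# Inside the generation loop one would replace
#     ypred = index2char[np.argmax(pred)]
# with
#     ypred = index2char[sample_index(pred, temperature=0.8)]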
