DL/LSTM: Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (deeper layers, Keras-based)


cqyd · 2022-09-19 11:43:36


Contents

Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (deeper layers, Keras-based)

Design Approach

Output

Core Code


Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (deeper layers, Keras-based)

Design Approach

Dataset download: https://download.csdn.net/download/qq_41185868/13767751
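
The pipeline is: load the novel as one lowercased string, map each of the 45 distinct characters to an integer, slide a 100-character window over the text to build input patterns, and predict the single character that follows each window. Below is a minimal preparation sketch consistent with the shapes printed in the output that follows; the file name wonderland.txt is an assumption, and the variable names (CharMapInt_dict, IntMapChar_dict, dataX, dataY, X_train, Y_train) follow the ones that appear in the log.

import numpy as np
from keras.utils import np_utils

# Read the corpus as one lowercased string (file name assumed).
raw_text = open('wonderland.txt', encoding='utf-8').read().lower()

# Build the character <-> integer lookup tables (45 distinct symbols here).
chars = sorted(set(raw_text))
CharMapInt_dict = {c: i for i, c in enumerate(chars)}
IntMapChar_dict = {i: c for i, c in enumerate(chars)}

# Slide a 100-character window over the text: each window is one input
# pattern, and the single character that follows it is the target.
seq_length = 100
dataX, dataY = [], []
for i in range(len(raw_text) - seq_length):   # 144413 - 100 = 144313 patterns
    dataX.append([CharMapInt_dict[c] for c in raw_text[i:i + seq_length]])
    dataY.append(CharMapInt_dict[raw_text[i + seq_length]])

# Reshape to [samples, time steps, features], scale to [0, 1],
# and one-hot encode the targets.
X_train = np.reshape(dataX, (len(dataX), seq_length, 1)) / float(len(chars))
Y_train = np_utils.to_categorical(dataY)      # shape (144313, 45)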

Output

Using TensorFlow backend.
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:523: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:524: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:526: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:527: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:532: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
[nltk_data] Error loading punkt: <urlopen error [Errno 11004]
[nltk_data] getaddrinfo failed>
raw_text[:10] : alice's ad
Total Characters: 144413
chars ['\n', ' ', '!', '"', "'", '(', ')', '*', ',', '-', '.', '0', '3', ':', ';', '?', '[', ']', '_', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']
Total Vocab: 45
sentences 1625 ["alice's adventures in wonderland\n\nlewis carroll\n\nthe millennium fulcrum edition 3.0\n\nchapter i. down the rabbit-hole\n\nalice was beginning to get very tired of sitting by her sister on the\nbank, and of having nothing to do: once or twice she had peeped into the\nbook her sister was reading, but it had no pictures or conversations in\nit, 'and what is the use of a book,' thought alice 'without pictures or\nconversations?'", 'so she was considering in her own mind (as well as she could, for the\nhot day made her feel very sleepy and stupid), whether the pleasure\nof making a daisy-chain would be worth the trouble of getting up and\npicking the daisies, when suddenly a white rabbit with pink eyes ran\nclose by her.', "there was nothing so very remarkable in that; nor did alice think it so\nvery much out of the way to hear the rabbit say to itself, 'oh dear!", 'oh dear!', "i shall be late!'"]
lengths (1625,) [420 289 140 ... 636 553 7]
CharMapInt_dict 45 {'\n': 0, ' ': 1, '!': 2, '"': 3, "'": 4, '(': 5, ')': 6, '*': 7, ',': 8, '-': 9, '.': 10, '0': 11, '3': 12, ':': 13, ';': 14, '?': 15, '[': 16, ']': 17, '_': 18, 'a': 19, 'b': 20, 'c': 21, 'd': 22, 'e': 23, 'f': 24, 'g': 25, 'h': 26, 'i': 27, 'j': 28, 'k': 29, 'l': 30, 'm': 31, 'n': 32, 'o': 33, 'p': 34, 'q': 35, 'r': 36, 's': 37, 't': 38, 'u': 39, 'v': 40, 'w': 41, 'x': 42, 'y': 43, 'z': 44}
IntMapChar_dict 45 {0: '\n', 1: ' ', 2: '!', 3: '"', 4: "'", 5: '(', 6: ')', 7: '*', 8: ',', 9: '-', 10: '.', 11: '0', 12: '3', 13: ':', 14: ';', 15: '?', 16: '[', 17: ']', 18: '_', 19: 'a', 20: 'b', 21: 'c', 22: 'd', 23: 'e', 24: 'f', 25: 'g', 26: 'h', 27: 'i', 28: 'j', 29: 'k', 30: 'l', 31: 'm', 32: 'n', 33: 'o', 34: 'p', 35: 'q', 36: 'r', 37: 's', 38: 't', 39: 'u', 40: 'v', 41: 'w', 42: 'x', 43: 'y', 44: 'z'}
dataX: 144313 100 [[19, 30, 27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32], [30, 27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1], [27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38], [21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38, 26], [23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38, 26, 23]]
dataY: 144313 [1, 38, 26, 23, 1]
Total patterns: 144313
X_train.shape (144313, 100, 1)
Y_train.shape (144313, 45)
Init data,after read_out, chars:
144313 alice's adventures in wonderland
lewis carroll
tge millennium fulcrum edition 3.0
cgapter i. down
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
F:\File_Jupyter\实用代码\NeuralNetwork(神经网络)\CharacterLanguageLSTM.py:135: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
LSTM_Model.fit(X_train[:train_index], Y_train[:train_index], nb_epoch=10, batch_size=64, callbacks=callbacks_list)
lstm_1 (LSTM)                (None, 256)               264192
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 45)                11565
=================================================================
Total params: 275,757
Trainable params: 275,757
Non-trainable params: 0
_________________________________________________________________
LSTM_Model
None
Epoch 1/10
2020-12-23 23:42:07.919094: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
64/1000 [>.............................] - ETA: 29s - loss: 3.8086
128/1000 [==>...........................] - ETA: 15s - loss: 3.7953
192/1000 [====>.........................] - ETA: 11s - loss: 3.7823
256/1000 [======>.......................] - ETA: 8s - loss: 3.7692
320/1000 [========>.....................] - ETA: 7s - loss: 3.7552
384/1000 [==========>...................] - ETA: 5s - loss: 3.7372
448/1000 [============>.................] - ETA: 4s - loss: 3.7026
512/1000 [==============>...............] - ETA: 4s - loss: 3.6552
576/1000 [================>.............] - ETA: 3s - loss: 3.5955
640/1000 [==================>...........] - ETA: 2s - loss: 3.5678
704/1000 [====================>.........] - ETA: 2s - loss: 3.5116
768/1000 [======================>.......] - ETA: 1s - loss: 3.4778
832/1000 [=======================>......] - ETA: 1s - loss: 3.4441
896/1000 [=========================>....] - ETA: 0s - loss: 3.4278
960/1000 [===========================>..] - ETA: 0s - loss: 3.4092
1000/1000 [==============================] - 7s 7ms/step - loss: 3.3925
Epoch 00001: loss improved from inf to 3.39249, saving model to hdf5/weights-improvement-01-3.3925.hdf5
Epoch 2/10
64/1000 [>.............................] - ETA: 4s - loss: 3.1429
128/1000 [==>...........................] - ETA: 4s - loss: 3.1370
192/1000 [====>.........................] - ETA: 3s - loss: 3.1034
256/1000 [======>.......................] - ETA: 3s - loss: 3.1038
320/1000 [========>.....................] - ETA: 3s - loss: 3.0962
384/1000 [==========>...................] - ETA: 2s - loss: 3.1055
448/1000 [============>.................] - ETA: 2s - loss: 3.0986
512/1000 [==============>...............] - ETA: 2s - loss: 3.0628
576/1000 [================>.............] - ETA: 2s - loss: 3.0452
640/1000 [==================>...........] - ETA: 1s - loss: 3.0571
704/1000 [====================>.........] - ETA: 1s - loss: 3.0684
768/1000 [======================>.......] - ETA: 1s - loss: 3.0606
832/1000 [=======================>......] - ETA: 0s - loss: 3.0596
896/1000 [=========================>....] - ETA: 0s - loss: 3.0529
960/1000 [===========================>..] - ETA: 0s - loss: 3.0484
1000/1000 [==============================] - 5s 5ms/step - loss: 3.0371
Epoch 00002: loss improved from 3.39249 to 3.03705, saving model to hdf5/weights-improvement-02-3.0371.hdf5
Epoch 3/10
64/1000 [>.............................] - ETA: 4s - loss: 3.1671
128/1000 [==>...........................] - ETA: 4s - loss: 3.0008
192/1000 [====>.........................] - ETA: 4s - loss: 3.0159
256/1000 [======>.......................] - ETA: 4s - loss: 3.0019
320/1000 [========>.....................] - ETA: 3s - loss: 3.0056
384/1000 [==========>...................] - ETA: 3s - loss: 3.0156
448/1000 [============>.................] - ETA: 2s - loss: 3.0392
512/1000 [==============>...............] - ETA: 2s - loss: 3.0243
576/1000 [================>.............] - ETA: 2s - loss: 3.0226
640/1000 [==================>...........] - ETA: 1s - loss: 3.0162
704/1000 [====================>.........] - ETA: 1s - loss: 3.0238
768/1000 [======================>.......] - ETA: 1s - loss: 3.0195
832/1000 [=======================>......] - ETA: 0s - loss: 3.0286
896/1000 [=========================>....] - ETA: 0s - loss: 3.0272
960/1000 [===========================>..] - ETA: 0s - loss: 3.0214
1000/1000 [==============================] - 6s 6ms/step - loss: 3.0225
Epoch 00003: loss improved from 3.03705 to 3.02249, saving model to hdf5/weights-improvement-03-3.0225.hdf5
Epoch 4/10
64/1000 [>.............................] - ETA: 5s - loss: 2.7843
128/1000 [==>...........................] - ETA: 5s - loss: 2.8997
192/1000 [====>.........................] - ETA: 4s - loss: 2.9975
256/1000 [======>.......................] - ETA: 4s - loss: 3.0150
320/1000 [========>.....................] - ETA: 3s - loss: 3.0025
384/1000 [==========>...................] - ETA: 3s - loss: 3.0442
448/1000 [============>.................] - ETA: 3s - loss: 3.0494
512/1000 [==============>...............] - ETA: 2s - loss: 3.0398
576/1000 [================>.............] - ETA: 2s - loss: 3.0170
640/1000 [==================>...........] - ETA: 2s - loss: 3.0421
704/1000 [====================>.........] - ETA: 1s - loss: 3.0366
768/1000 [======================>.......] - ETA: 1s - loss: 3.0339
832/1000 [=======================>......] - ETA: 0s - loss: 3.0316
896/1000 [=========================>....] - ETA: 0s - loss: 3.0361
960/1000 [===========================>..] - ETA: 0s - loss: 3.0326
1000/1000 [==============================] - 6s 6ms/step - loss: 3.0352
Epoch 00004: loss did not improve from 3.02249
Epoch 5/10
64/1000 [>.............................] - ETA: 4s - loss: 2.8958
128/1000 [==>...........................] - ETA: 4s - loss: 2.9239
192/1000 [====>.........................] - ETA: 4s - loss: 2.9044
256/1000 [======>.......................] - ETA: 4s - loss: 2.9417
320/1000 [========>.....................] - ETA: 3s - loss: 2.9674
384/1000 [==========>...................] - ETA: 3s - loss: 2.9646
448/1000 [============>.................] - ETA: 3s - loss: 2.9629
512/1000 [==============>...............] - ETA: 2s - loss: 2.9707
576/1000 [================>.............] - ETA: 2s - loss: 2.9699
640/1000 [==================>...........] - ETA: 1s - loss: 2.9594
704/1000 [====================>.........] - ETA: 1s - loss: 2.9830
768/1000 [======================>.......] - ETA: 1s - loss: 2.9773
832/1000 [=======================>......] - ETA: 0s - loss: 2.9774
896/1000 [=========================>....] - ETA: 0s - loss: 2.9891
960/1000 [===========================>..] - ETA: 0s - loss: 3.0070
1000/1000 [==============================] - 5s 5ms/step - loss: 3.0120
Epoch 00005: loss improved from 3.02249 to 3.01205, saving model to hdf5/weights-improvement-05-3.0120.hdf5
Epoch 6/10
64/1000 [>.............................] - ETA: 4s - loss: 3.0241
128/1000 [==>...........................] - ETA: 4s - loss: 3.0463
192/1000 [====>.........................] - ETA: 3s - loss: 3.0364
256/1000 [======>.......................] - ETA: 3s - loss: 2.9712
320/1000 [========>.....................] - ETA: 3s - loss: 2.9840
384/1000 [==========>...................] - ETA: 3s - loss: 2.9887
448/1000 [============>.................] - ETA: 2s - loss: 2.9785
512/1000 [==============>...............] - ETA: 2s - loss: 2.9852
576/1000 [================>.............] - ETA: 2s - loss: 2.9893
640/1000 [==================>...........] - ETA: 1s - loss: 2.9931
704/1000 [====================>.........] - ETA: 1s - loss: 2.9790
768/1000 [======================>.......] - ETA: 1s - loss: 2.9962
832/1000 [=======================>......] - ETA: 0s - loss: 3.0166
896/1000 [=========================>....] - ETA: 0s - loss: 3.0213
960/1000 [===========================>..] - ETA: 0s - loss: 3.0143
1000/1000 [==============================] - 5s 5ms/step - loss: 3.0070
Epoch 00006: loss improved from 3.01205 to 3.00701, saving model to hdf5/weights-improvement-06-3.0070.hdf5
Epoch 7/10
64/1000 [>.............................] - ETA: 5s - loss: 3.0738
128/1000 [==>...........................] - ETA: 5s - loss: 3.0309
192/1000 [====>.........................] - ETA: 4s - loss: 2.9733
256/1000 [======>.......................] - ETA: 4s - loss: 2.9728
320/1000 [========>.....................] - ETA: 4s - loss: 2.9422
384/1000 [==========>...................] - ETA: 3s - loss: 2.9496
448/1000 [============>.................] - ETA: 3s - loss: 2.9548
512/1000 [==============>...............] - ETA: 3s - loss: 2.9635
576/1000 [================>.............] - ETA: 2s - loss: 2.9614
640/1000 [==================>...........] - ETA: 2s - loss: 2.9537
704/1000 [====================>.........] - ETA: 1s - loss: 2.9454
768/1000 [======================>.......] - ETA: 1s - loss: 2.9649
832/1000 [=======================>......] - ETA: 1s - loss: 2.9814
896/1000 [=========================>....] - ETA: 0s - loss: 2.9955
960/1000 [===========================>..] - ETA: 0s - loss: 2.9948
1000/1000 [==============================] - 6s 6ms/step - loss: 2.9903
Epoch 00007: loss improved from 3.00701 to 2.99027, saving model to hdf5/weights-improvement-07-2.9903.hdf5
Epoch 8/10
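
Core Code

The code listing itself is truncated in this copy of the post, but the model and the training call can be reconstructed from the summary and the warning above with little ambiguity: an LSTM(256) layer on (100, 1) inputs gives exactly 264,192 parameters and Dense(45) gives 11,565, matching the printed totals, and the checkpoint file names match Keras's ModelCheckpoint pattern. The sketch below is hedged accordingly: the dropout rate (0.2), the Adam optimizer, and train_index = 1000 (the progress bars run to 1000 samples per epoch) are assumptions, and the deprecated nb_epoch argument is replaced by epochs, as the warning asks.

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM
from keras.callbacks import ModelCheckpoint

# Single-LSTM-layer model matching the printed summary:
# 264,192 + 0 + 11,565 = 275,757 parameters.
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))  # (100, 1)
LSTM_Model.add(Dropout(0.2))                                   # rate assumed
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))  # 45 classes
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print(LSTM_Model.summary())   # prints the table above, then None

# Checkpoint whenever the training loss improves; this pattern reproduces
# file names like hdf5/weights-improvement-01-3.3925.hdf5 seen in the log.
filepath = 'hdf5/weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')
callbacks_list = [checkpoint]

train_index = 1000   # assumed from the log
LSTM_Model.fit(X_train[:train_index], Y_train[:train_index],
               epochs=10, batch_size=64, callbacks=callbacks_list)

Since the point of the post is predicting a single character, here is a short usage sketch as well: pick a random 100-character pattern as the seed and read off the most probable next character.

# Predict the single character that follows a randomly chosen seed pattern.
start = np.random.randint(0, len(dataX) - 1)
pattern = dataX[start]
x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(chars))
prediction = LSTM_Model.predict(x, verbose=0)
print(IntMapChar_dict[int(np.argmax(prediction))])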
