DL/DNN: Understanding overfitting by training a MultiLayerNet model [6*100 + ReLU + SGD] on the MNIST dataset


北极帝企鹅 2022-09-19

Overview

Train a fully connected MultiLayerNet model [6*100 + ReLU + SGD] on a deliberately small, custom subset of the MNIST dataset, and observe the overfitting that results.
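The overfitting here is easy to anticipate from a parameter count. A minimal sketch (the 784 input pixels and 10 output classes come from MNIST; the six hidden layers of 100 units are the configuration in the title):

```python
# Layer sizes mirroring the article's MultiLayerNet: 784 MNIST pixels in,
# six hidden layers of 100 units (each followed by ReLU), 10 classes out.
layer_sizes = [784] + [100] * 6 + [10]

# Total weight + bias count across all fully connected layers. With only a
# small training subset, parameters vastly outnumber training samples,
# which is exactly the regime where overfitting appears.
n_params = sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))
print(n_params)  # → 130010
```

At roughly 130,000 parameters against a few hundred training images, the network has more than enough capacity to memorize the training set.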

Contents

Output results

Design approach

Core code

More output


Output results

Design approach

Core code

for i in range(1000000):
    # sample a random mini-batch from the (small) training set
    batch_mask = np.random.choice(train_size, batch_size)
    x_batch = x_train[batch_mask]
    t_batch = t_train[batch_mask]
    # backpropagate and update the parameters
    grads = network.gradient(x_batch, t_batch)
    optimizer.update(network.params, grads)
    # once per epoch, record accuracy on the full train and test sets
    if i % iter_per_epoch == 0:
        train_acc = network.accuracy(x_train, t_train)
        test_acc = network.accuracy(x_test, t_test)
        train_acc_list.append(train_acc)
        test_acc_list.append(test_acc)
        print("epoch:" + str(epoch_cnt) + ", train_acc:" + str(float('%.4f' % train_acc)) + ", test_acc:" + str(float('%.4f' % test_acc)))
        epoch_cnt += 1
        if epoch_cnt >= max_epochs:
            break
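The `optimizer.update(network.params, grads)` call above expects an optimizer that walks a dictionary of parameter arrays. A plain SGD optimizer matching that interface can be sketched as follows (a minimal sketch, not necessarily the exact class used in the article):

```python
class SGD:
    """Vanilla stochastic gradient descent: param <- param - lr * grad."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def update(self, params, grads):
        # params and grads are dicts keyed the same way, e.g. 'W1', 'b1';
        # values may be floats or numpy arrays (arrays are updated in place).
        for key in params:
            params[key] -= self.lr * grads[key]
```

Because plain SGD has no momentum, weight decay, or adaptive learning rate, nothing slows the network down as it fits the small training set ever more closely.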

More output

epoch:0, train_acc:0.0733, test_acc:0.0792
epoch:1, train_acc:0.0767, test_acc:0.0878
epoch:2, train_acc:0.0967, test_acc:0.0966
epoch:3, train_acc:0.1, test_acc:0.1016
epoch:4, train_acc:0.1133, test_acc:0.1065
epoch:5, train_acc:0.1167, test_acc:0.1166
epoch:6, train_acc:0.13, test_acc:0.1249
epoch:7, train_acc:0.1567, test_acc:0.1348
epoch:8, train_acc:0.1867, test_acc:0.1441
epoch:9, train_acc:0.2067, test_acc:0.1602
epoch:10, train_acc:0.2333, test_acc:0.1759
epoch:11, train_acc:0.24, test_acc:0.1812
epoch:12, train_acc:0.2567, test_acc:0.1963
epoch:13, train_acc:0.2867, test_acc:0.2161
epoch:14, train_acc:0.31, test_acc:0.2292
epoch:15, train_acc:0.35, test_acc:0.2452
epoch:16, train_acc:0.3567, test_acc:0.2609
epoch:17, train_acc:0.3867, test_acc:0.2678
epoch:18, train_acc:0.4, test_acc:0.2796
epoch:19, train_acc:0.41, test_acc:0.291
epoch:20, train_acc:0.42, test_acc:0.2978
epoch:21, train_acc:0.4267, test_acc:0.3039
epoch:22, train_acc:0.4433, test_acc:0.3122
epoch:23, train_acc:0.4533, test_acc:0.3199
epoch:24, train_acc:0.4633, test_acc:0.3252
epoch:25, train_acc:0.47, test_acc:0.3326
epoch:26, train_acc:0.4733, test_acc:0.3406
epoch:27, train_acc:0.4733, test_acc:0.3506
epoch:28, train_acc:0.4733, test_acc:0.3537
epoch:29, train_acc:0.4867, test_acc:0.3582
epoch:30, train_acc:0.4933, test_acc:0.3583
epoch:31, train_acc:0.4967, test_acc:0.3655
epoch:32, train_acc:0.4933, test_acc:0.3707
epoch:33, train_acc:0.4967, test_acc:0.3722
epoch:34, train_acc:0.5033, test_acc:0.3806
epoch:35, train_acc:0.5133, test_acc:0.3776
epoch:36, train_acc:0.51, test_acc:0.3804
epoch:37, train_acc:0.5167, test_acc:0.3837
epoch:38, train_acc:0.52, test_acc:0.3838
epoch:39, train_acc:0.5167, test_acc:0.3844
epoch:40, train_acc:0.5167, test_acc:0.3933
epoch:41, train_acc:0.5233, test_acc:0.397
epoch:42, train_acc:0.5267, test_acc:0.3967
epoch:43, train_acc:0.5333, test_acc:0.4021
epoch:44, train_acc:0.5267, test_acc:0.3961
epoch:45, train_acc:0.5367, test_acc:0.3997
epoch:46, train_acc:0.54, test_acc:0.4126
epoch:47, train_acc:0.5533, test_acc:0.421
epoch:48, train_acc:0.5533, test_acc:0.4274
epoch:49, train_acc:0.5533, test_acc:0.4246
epoch:50, train_acc:0.5633, test_acc:0.4322
epoch:51, train_acc:0.5667, test_acc:0.4372
epoch:52, train_acc:0.5867, test_acc:0.4544
epoch:53, train_acc:0.6133, test_acc:0.4631
epoch:54, train_acc:0.6167, test_acc:0.475
epoch:55, train_acc:0.6167, test_acc:0.4756
epoch:56, train_acc:0.6267, test_acc:0.4801
epoch:57, train_acc:0.6333, test_acc:0.4822
epoch:58, train_acc:0.62, test_acc:0.4809
epoch:59, train_acc:0.63, test_acc:0.491
epoch:60, train_acc:0.6233, test_acc:0.4939
epoch:61, train_acc:0.6367, test_acc:0.501
epoch:62, train_acc:0.65, test_acc:0.5156
epoch:63, train_acc:0.65, test_acc:0.5192
epoch:64, train_acc:0.65, test_acc:0.518
epoch:65, train_acc:0.6367, test_acc:0.5204
epoch:66, train_acc:0.6667, test_acc:0.527
epoch:67, train_acc:0.6567, test_acc:0.533
epoch:68, train_acc:0.6633, test_acc:0.5384
epoch:69, train_acc:0.6733, test_acc:0.5374
epoch:70, train_acc:0.67, test_acc:0.5365
epoch:71, train_acc:0.69, test_acc:0.5454
epoch:72, train_acc:0.68, test_acc:0.5479
epoch:73, train_acc:0.6833, test_acc:0.553
epoch:74, train_acc:0.6967, test_acc:0.5568
epoch:75, train_acc:0.68, test_acc:0.55
epoch:76, train_acc:0.7, test_acc:0.5567
epoch:77, train_acc:0.71, test_acc:0.5617
epoch:78, train_acc:0.7167, test_acc:0.5705
epoch:79, train_acc:0.73, test_acc:0.5722
epoch:80, train_acc:0.74, test_acc:0.5831
epoch:81, train_acc:0.73, test_acc:0.5778
epoch:82, train_acc:0.7567, test_acc:0.5845
epoch:83, train_acc:0.7533, test_acc:0.587
epoch:84, train_acc:0.75, test_acc:0.5809
epoch:85, train_acc:0.7433, test_acc:0.5869
epoch:86, train_acc:0.7533, test_acc:0.5996
epoch:87, train_acc:0.75, test_acc:0.5963
epoch:88, train_acc:0.7667, test_acc:0.6079
epoch:89, train_acc:0.7733, test_acc:0.6247
epoch:90, train_acc:0.7633, test_acc:0.6152
epoch:91, train_acc:0.79, test_acc:0.6307
epoch:92, train_acc:0.7967, test_acc:0.637
epoch:93, train_acc:0.8033, test_acc:0.6351
epoch:94, train_acc:0.8, test_acc:0.6464
epoch:95, train_acc:0.7967, test_acc:0.6308
epoch:96, train_acc:0.8067, test_acc:0.6406
epoch:97, train_acc:0.8033, test_acc:0.6432
epoch:98, train_acc:0.81, test_acc:0.657
epoch:99, train_acc:0.81, test_acc:0.6523
epoch:100, train_acc:0.8167, test_acc:0.6487
epoch:101, train_acc:0.8033, test_acc:0.6532
epoch:102, train_acc:0.8133, test_acc:0.672
epoch:103, train_acc:0.8233, test_acc:0.6738
epoch:104, train_acc:0.82, test_acc:0.6588
epoch:105, train_acc:0.8167, test_acc:0.659
epoch:106, train_acc:0.82, test_acc:0.6643
epoch:107, train_acc:0.8233, test_acc:0.6696
epoch:108, train_acc:0.8167, test_acc:0.6665
epoch:109, train_acc:0.8133, test_acc:0.6523
epoch:110, train_acc:0.83, test_acc:0.6744
epoch:111, train_acc:0.8267, test_acc:0.6746
epoch:112, train_acc:0.83, test_acc:0.6757
epoch:113, train_acc:0.8267, test_acc:0.6749
epoch:114, train_acc:0.8167, test_acc:0.668
epoch:115, train_acc:0.8267, test_acc:0.6726
epoch:116, train_acc:0.83, test_acc:0.6794
epoch:117, train_acc:0.8167, test_acc:0.6632
epoch:118, train_acc:0.8233, test_acc:0.6599
epoch:119, train_acc:0.8267, test_acc:0.6692
epoch:120, train_acc:0.83, test_acc:0.6695
epoch:121, train_acc:0.8367, test_acc:0.6781
epoch:122, train_acc:0.8333, test_acc:0.6689
epoch:123, train_acc:0.8367, test_acc:0.6789
epoch:124, train_acc:0.8333, test_acc:0.6821
epoch:125, train_acc:0.8367, test_acc:0.6821
epoch:126, train_acc:0.8267, test_acc:0.6742
epoch:127, train_acc:0.8433, test_acc:0.6823
epoch:128, train_acc:0.8367, test_acc:0.6828
epoch:129, train_acc:0.8367, test_acc:0.6864
epoch:130, train_acc:0.84, test_acc:0.674
epoch:131, train_acc:0.84, test_acc:0.676
epoch:132, train_acc:0.83, test_acc:0.6715
epoch:133, train_acc:0.84, test_acc:0.6938
epoch:134, train_acc:0.8333, test_acc:0.7013
epoch:135, train_acc:0.84, test_acc:0.6979
epoch:136, train_acc:0.84, test_acc:0.6822
epoch:137, train_acc:0.84, test_acc:0.6929
epoch:138, train_acc:0.8433, test_acc:0.6921
epoch:139, train_acc:0.8433, test_acc:0.6963
epoch:140, train_acc:0.83, test_acc:0.6976
epoch:141, train_acc:0.84, test_acc:0.6897
epoch:142, train_acc:0.8433, test_acc:0.6994
epoch:143, train_acc:0.8467, test_acc:0.7042
epoch:144, train_acc:0.8567, test_acc:0.6963
epoch:145, train_acc:0.86, test_acc:0.6966
epoch:146, train_acc:0.8533, test_acc:0.6813
epoch:147, train_acc:0.85, test_acc:0.6891
epoch:148, train_acc:0.8667, test_acc:0.6908
epoch:149, train_acc:0.8467, test_acc:0.6719
epoch:150, train_acc:0.85, test_acc:0.6783
epoch:151, train_acc:0.86, test_acc:0.6969
epoch:152, train_acc:0.86, test_acc:0.7071
epoch:153, train_acc:0.8567, test_acc:0.6974
epoch:154, train_acc:0.86, test_acc:0.7009
epoch:155, train_acc:0.86, test_acc:0.6931
epoch:156, train_acc:0.8567, test_acc:0.6946
epoch:157, train_acc:0.86, test_acc:0.7004
epoch:158, train_acc:0.86, test_acc:0.7023
epoch:159, train_acc:0.85, test_acc:0.7054
epoch:160, train_acc:0.8633, test_acc:0.6933
epoch:161, train_acc:0.8667, test_acc:0.6872
epoch:162, train_acc:0.86, test_acc:0.6844
epoch:163, train_acc:0.8567, test_acc:0.6909
epoch:164, train_acc:0.8633, test_acc:0.6884
epoch:165, train_acc:0.87, test_acc:0.7005
epoch:166, train_acc:0.8667, test_acc:0.6926
epoch:167, train_acc:0.8633, test_acc:0.7131
epoch:168, train_acc:0.86, test_acc:0.7068
epoch:169, train_acc:0.87, test_acc:0.7045
epoch:170, train_acc:0.8633, test_acc:0.7027
epoch:171, train_acc:0.87, test_acc:0.6917
epoch:172, train_acc:0.87, test_acc:0.7046
epoch:173, train_acc:0.87, test_acc:0.71
epoch:174, train_acc:0.8767, test_acc:0.714
epoch:175, train_acc:0.87, test_acc:0.6925
epoch:176, train_acc:0.8633, test_acc:0.7112
epoch:177, train_acc:0.8733, test_acc:0.7149
epoch:178, train_acc:0.8567, test_acc:0.7056
epoch:179, train_acc:0.8633, test_acc:0.7149
epoch:180, train_acc:0.8567, test_acc:0.6962
epoch:181, train_acc:0.87, test_acc:0.7011
epoch:182, train_acc:
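The overfitting in the log above is easiest to see as the train-test accuracy gap rather than as two separate columns. A small parser over two lines taken directly from the log shows the gap widening from roughly zero to about 17 percentage points:

```python
def acc_gap(line):
    # Parse a log line like "epoch:0, train_acc:0.0733, test_acc:0.0792"
    # into (train_acc - test_acc), the overfitting signal.
    fields = dict(part.split(":") for part in line.split(", "))
    return float(fields["train_acc"]) - float(fields["test_acc"])

early = acc_gap("epoch:0, train_acc:0.0733, test_acc:0.0792")    # ≈ -0.006
late = acc_gap("epoch:181, train_acc:0.87, test_acc:0.7011")     # ≈ 0.169
```

Early in training the two accuracies move together; by the end, training accuracy keeps climbing while test accuracy plateaus around 0.70, which is the textbook overfitting picture for an overparameterized network on a tiny training set.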

Original link: https://www.xckfsq.com/news/show.html?id=3178