DL/DNN: Training a MultiLayerNet model [6×100 + ReLU + SGD, weight_decay] on the MNIST dataset to suppress overfitting


鸡翅无心
2022-09-19 15:07:14
Category: News


Contents

Output

Design Approach

Core Code

More Output


Output

Design Approach

Core Code

# Hyperparameter: weight_decay_lambda = 0 disables the L2 penalty;
# 0.1 enables the weight decay used to suppress overfitting.
# weight_decay_lambda = 0
weight_decay_lambda = 0.1

# network (a MultiLayerNet), optimizer (SGD), the MNIST arrays and the
# bookkeeping variables (train_size, batch_size, iter_per_epoch,
# epoch_cnt, max_epochs, the accuracy lists) are set up earlier in the
# full script.
for i in range(1000000):
    # Draw a random mini-batch.
    batch_mask = np.random.choice(train_size, batch_size)
    x_batch = x_train[batch_mask]
    t_batch = t_train[batch_mask]

    # Backprop (the weight-decay term is folded into network.gradient),
    # then one SGD step.
    grads = network.gradient(x_batch, t_batch)
    optimizer.update(network.params, grads)

    # Evaluate on the full train/test sets once per epoch.
    if i % iter_per_epoch == 0:
        train_acc = network.accuracy(x_train, t_train)
        test_acc = network.accuracy(x_test, t_test)
        train_acc_list.append(train_acc)
        test_acc_list.append(test_acc)
        print("epoch:" + str(epoch_cnt)
              + ", train_acc:" + str(float('%.4f' % train_acc))
              + ", test_acc:" + str(float('%.4f' % test_acc)))
        epoch_cnt += 1
        if epoch_cnt >= max_epochs:
            break
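The essential effect of weight_decay_lambda in the loop above is that the gradient of the L2 penalty, lambda * W, is added to each weight's data-loss gradient before the plain SGD step. The following is a minimal self-contained sketch of one such update using scalar stand-in values (the numbers are illustrative, not from the real network); MultiLayerNet applies the same rule elementwise to each weight matrix.

```python
# One SGD step with L2 weight decay, sketched with scalar stand-ins.
lr = 0.01                  # SGD learning rate
weight_decay_lambda = 0.1  # strength of the L2 penalty

w, grad_w = 0.5, 0.2       # a weight and its data-loss gradient
b, grad_b = 0.1, 0.3       # a bias (conventionally not decayed)

# Weight decay adds the penalty gradient lambda * w to the weight's
# gradient, pulling weights toward zero and limiting effective capacity.
w -= lr * (grad_w + weight_decay_lambda * w)
b -= lr * grad_b

print(w)  # 0.5 - 0.01 * (0.2 + 0.1 * 0.5) = 0.4975
print(b)  # 0.1 - 0.01 * 0.3 = 0.097
```

Because the penalty gradient is proportional to the weight itself, large weights are shrunk hardest, which is why the decayed run generalizes better than the undecayed one.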

More Output

1. MultiLayerNet [6×100 + ReLU + SGD]: DIY overfitting dataset based on MNIST, train_acc vs. test_acc

epoch:0, train_acc:0.06, test_acc:0.0834
epoch:1, train_acc:0.1233, test_acc:0.1109
epoch:2, train_acc:0.1467, test_acc:0.1292
epoch:3, train_acc:0.2233, test_acc:0.1717
epoch:4, train_acc:0.2567, test_acc:0.1891
epoch:5, train_acc:0.27, test_acc:0.2181
epoch:6, train_acc:0.31, test_acc:0.229
epoch:7, train_acc:0.32, test_acc:0.24
epoch:8, train_acc:0.3567, test_acc:0.2502
epoch:9, train_acc:0.37, test_acc:0.2651
epoch:10, train_acc:0.3767, test_acc:0.2743
epoch:11, train_acc:0.39, test_acc:0.2833
epoch:12, train_acc:0.3767, test_acc:0.2769
epoch:13, train_acc:0.4067, test_acc:0.295
epoch:14, train_acc:0.4667, test_acc:0.3169
epoch:15, train_acc:0.45, test_acc:0.3213
epoch:16, train_acc:0.5067, test_acc:0.3439
epoch:17, train_acc:0.54, test_acc:0.3593
epoch:18, train_acc:0.5233, test_acc:0.3687
epoch:19, train_acc:0.5367, test_acc:0.3691
epoch:20, train_acc:0.5667, test_acc:0.4051
epoch:21, train_acc:0.5967, test_acc:0.4265
epoch:22, train_acc:0.63, test_acc:0.4477
epoch:23, train_acc:0.6467, test_acc:0.4627
epoch:24, train_acc:0.6567, test_acc:0.4708
epoch:25, train_acc:0.6533, test_acc:0.4896
epoch:26, train_acc:0.66, test_acc:0.5034
epoch:27, train_acc:0.68, test_acc:0.5107
epoch:28, train_acc:0.6833, test_acc:0.5083
epoch:29, train_acc:0.7067, test_acc:0.5244
epoch:30, train_acc:0.7567, test_acc:0.5564
epoch:31, train_acc:0.7333, test_acc:0.5411
epoch:32, train_acc:0.7533, test_acc:0.5698
epoch:33, train_acc:0.7633, test_acc:0.5738
epoch:34, train_acc:0.7833, test_acc:0.5764
epoch:35, train_acc:0.7633, test_acc:0.5863
epoch:36, train_acc:0.7733, test_acc:0.5915
epoch:37, train_acc:0.8067, test_acc:0.608
epoch:38, train_acc:0.81, test_acc:0.6113
epoch:39, train_acc:0.8033, test_acc:0.5922
epoch:40, train_acc:0.8233, test_acc:0.6192
epoch:41, train_acc:0.83, test_acc:0.6203
epoch:42, train_acc:0.8033, test_acc:0.6066
epoch:43, train_acc:0.8333, test_acc:0.6311
epoch:44, train_acc:0.8433, test_acc:0.6273
epoch:45, train_acc:0.85, test_acc:0.6413
epoch:46, train_acc:0.85, test_acc:0.6375
epoch:47, train_acc:0.86, test_acc:0.6352
epoch:48, train_acc:0.8667, test_acc:0.6504
epoch:49, train_acc:0.8767, test_acc:0.6588
epoch:50, train_acc:0.8667, test_acc:0.6592
epoch:51, train_acc:0.89, test_acc:0.6648
epoch:52, train_acc:0.88, test_acc:0.6605
epoch:53, train_acc:0.88, test_acc:0.6654
epoch:54, train_acc:0.8967, test_acc:0.6674
epoch:55, train_acc:0.8967, test_acc:0.6701
epoch:56, train_acc:0.9, test_acc:0.6636
epoch:57, train_acc:0.9, test_acc:0.6755
epoch:58, train_acc:0.9167, test_acc:0.6763
epoch:59, train_acc:0.9133, test_acc:0.6748
epoch:60, train_acc:0.92, test_acc:0.6788
epoch:61, train_acc:0.9033, test_acc:0.6759
epoch:62, train_acc:0.9133, test_acc:0.6747
epoch:63, train_acc:0.9233, test_acc:0.6915
epoch:64, train_acc:0.9267, test_acc:0.687
epoch:65, train_acc:0.92, test_acc:0.6822
epoch:66, train_acc:0.9133, test_acc:0.6827
epoch:67, train_acc:0.92, test_acc:0.6932
epoch:68, train_acc:0.9333, test_acc:0.6976
epoch:69, train_acc:0.94, test_acc:0.6953
epoch:70, train_acc:0.94, test_acc:0.7031
epoch:71, train_acc:0.9367, test_acc:0.6951
epoch:72, train_acc:0.9433, test_acc:0.7036
epoch:73, train_acc:0.9367, test_acc:0.7051
epoch:74, train_acc:0.9433, test_acc:0.706
epoch:75, train_acc:0.95, test_acc:0.707
epoch:76, train_acc:0.9567, test_acc:0.7052
epoch:77, train_acc:0.9433, test_acc:0.6991
epoch:78, train_acc:0.9567, test_acc:0.7121
epoch:79, train_acc:0.9633, test_acc:0.7055
epoch:80, train_acc:0.96, test_acc:0.7088
epoch:81, train_acc:0.9567, test_acc:0.7105
epoch:82, train_acc:0.9633, test_acc:0.7091
epoch:83, train_acc:0.9567, test_acc:0.7159
epoch:84, train_acc:0.9567, test_acc:0.7072
epoch:85, train_acc:0.9633, test_acc:0.7138
epoch:86, train_acc:0.9767, test_acc:0.7127
epoch:87, train_acc:0.9733, test_acc:0.7167
epoch:88, train_acc:0.9733, test_acc:0.7241
epoch:89, train_acc:0.98, test_acc:0.721
epoch:90, train_acc:0.9767, test_acc:0.7202
epoch:91, train_acc:0.9767, test_acc:0.7232
epoch:92, train_acc:0.9833, test_acc:0.717
epoch:93, train_acc:0.9867, test_acc:0.7215
epoch:94, train_acc:0.9867, test_acc:0.7299
epoch:95, train_acc:0.9833, test_acc:0.728
epoch:96, train_acc:0.99, test_acc:0.7223
epoch:97, train_acc:0.9867, test_acc:0.7205
epoch:98, train_acc:0.99, test_acc:0.7287
epoch:99, train_acc:0.9967, test_acc:0.7298
epoch:100, train_acc:0.99, test_acc:0.7288
epoch:101, train_acc:1.0, test_acc:0.7258
epoch:102, train_acc:0.9967, test_acc:0.7274
epoch:103, train_acc:0.9967, test_acc:0.7238
epoch:104, train_acc:1.0, test_acc:0.7275
epoch:105, train_acc:0.9967, test_acc:0.7275
epoch:106, train_acc:1.0, test_acc:0.7209
epoch:107, train_acc:1.0, test_acc:0.7306
epoch:108, train_acc:0.9933, test_acc:0.7267
epoch:109, train_acc:0.9967, test_acc:0.7278
epoch:110, train_acc:1.0, test_acc:0.7306
epoch:111, train_acc:1.0, test_acc:0.7279
epoch:112, train_acc:0.9967, test_acc:0.7326
epoch:113, train_acc:0.9967, test_acc:0.7274
epoch:114, train_acc:0.9967, test_acc:0.7279
epoch:115, train_acc:1.0, test_acc:0.7301
epoch:116, train_acc:1.0, test_acc:0.7296
epoch:117, train_acc:1.0, test_acc:0.7327
epoch:118, train_acc:1.0, test_acc:0.7248
epoch:119, train_acc:1.0, test_acc:0.733
epoch:120, train_acc:1.0, test_acc:0.7286
epoch:121, train_acc:1.0, test_acc:0.7302
epoch:122, train_acc:1.0, test_acc:0.7346
epoch:123, train_acc:1.0, test_acc:0.7309
epoch:124, train_acc:1.0, test_acc:0.7309
epoch:125, train_acc:1.0, test_acc:0.7327
epoch:126, train_acc:1.0, test_acc:0.7353
epoch:127, train_acc:1.0, test_acc:0.7316
epoch:128, train_acc:1.0, test_acc:0.7296
epoch:129, train_acc:1.0, test_acc:0.731
epoch:130, train_acc:1.0, test_acc:0.733
epoch:131, train_acc:1.0, test_acc:0.7331
epoch:132, train_acc:1.0, test_acc:0.732
epoch:133, train_acc:1.0, test_acc:0.7333
epoch:134, train_acc:1.0, test_acc:0.7288
epoch:135, train_acc:1.0, test_acc:0.7347
epoch:136, train_acc:1.0, test_acc:0.7349
epoch:137, train_acc:1.0, test_acc:0.7356
epoch:138, train_acc:1.0, test_acc:0.7308
epoch:139, train_acc:1.0, test_acc:0.7359
epoch:140, train_acc:1.0, test_acc:0.7337
epoch:141, train_acc:1.0, test_acc:0.7355
epoch:142, train_acc:1.0, test_acc:0.7349
epoch:143, train_acc:1.0, test_acc:0.7327
epoch:144, train_acc:1.0, test_acc:0.7344
epoch:145, train_acc:1.0, test_acc:0.7367
epoch:146, train_acc:1.0, test_acc:0.7372
epoch:147, train_acc:1.0, test_acc:0.7353
epoch:148, train_acc:1.0, test_acc:0.7373
epoch:149, train_acc:1.0, test_acc:0.7362
epoch:150, train_acc:1.0, test_acc:0.7366
epoch:151, train_acc:1.0, test_acc:0.7376
epoch:152, train_acc:1.0, test_acc:0.7357
epoch:153, train_acc:1.0, test_acc:0.7341
epoch:154, train_acc:1.0, test_acc:0.7338
epoch:155, train_acc:1.0, test_acc:0.7351
epoch:156, train_acc:1.0, test_acc:0.7339
epoch:157, train_acc:1.0, test_acc:0.7383
epoch:158, train_acc:1.0, test_acc:0.7366
epoch:159, train_acc:1.0, test_acc:0.7376
epoch:160, train_acc:1.0, test_acc:0.7383
epoch:161, train_acc:1.0, test_acc:0.7404
epoch:162, train_acc:1.0, test_acc:0.7373
epoch:163, train_acc:1.0, test_acc:0.7357
epoch:164, train_acc:1.0, test_acc:0.7359
epoch:165, train_acc:1.0, test_acc:0.7392
epoch:166, train_acc:1.0, test_acc:0.7384
epoch:167, train_acc:1.0, test_acc:0.7381
epoch:168, train_acc:1.0, test_acc:0.734
epoch:169, train_acc:1.0, test_acc:0.7352
epoch:170, train_acc:1.0, test_acc:0.7356
epoch:171, train_acc:1.0, test_acc:0.7381
epoch:172, train_acc:1.0, test_acc:0.7384
epoch:173, train_acc:1.0, test_acc:0.7398
epoch:174, train_acc:1.0, test_acc:0.7395
epoch:175, train_acc:1.0, test_acc:0.7413
epoch:176, train_acc:1.0, test_acc:0.7387
epoch:177, train_acc:1.0, test_acc:0.7402
epoch:178, train_acc:1.0, test_acc:0.7378
epoch:179, train_acc:1.0, test_acc:0.7389
epoch:180, train_acc:1.0, test_acc:0.7396
epoch:181, train_acc:1.0, test_acc:0.7375
epoch:182, train_acc:1.0, test_acc:0.7403
epoch:183, train_acc:1.0, test_acc:0.7392
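The log ends with train_acc pinned at 1.0 while test_acc plateaus around 0.74, the classic overfitting signature that weight decay is meant to suppress (train_acc saturating at 1.0 suggests this particular run had the penalty disabled). A quick check of the generalization gap at the final epoch, with both values taken from the last log line:

```python
# Generalization gap at the final logged epoch (epoch 183);
# both numbers come from the last line of the training log.
final_train_acc = 1.0
final_test_acc = 0.7392

gap = final_train_acc - final_test_acc
print("generalization gap: %.4f" % gap)  # prints: generalization gap: 0.2608
```

A gap this large on a held-out set is the quantitative cue to turn weight_decay_lambda on (or raise it) and rerun the comparison.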

Original article: https://www.xckfsq.com/news/show.html?id=3177