DL | DNN: Training a MultiLayerNetExtend model [6×100 + ReLU + SGD, dropout] on the MNIST dataset to suppress overfitting


睫毛膏俊秀 · 2022-09-19 15:06:58 · 50276 views
Column: 资讯 (News)


Contents

Output

Design

Core code

More output


Output

Design

Updated 2019-04-17

Core code

    import numpy as np
    # The module paths below follow the layout of the book's companion repository
    # (deep-learning-from-scratch); adjust them to match your checkout.
    from common.multi_layer_net_extend import MultiLayerNetExtend
    from common.trainer import Trainer
    from dataset.mnist import load_mnist

    class RMSprop:
        """RMSprop: scale each update by a running average of squared gradients."""
        def __init__(self, lr=0.01, decay_rate=0.99):
            self.lr = lr
            self.decay_rate = decay_rate
            self.h = None

        def update(self, params, grads):
            if self.h is None:
                self.h = {}
                for key, val in params.items():
                    self.h[key] = np.zeros_like(val)
            for key in params.keys():
                self.h[key] *= self.decay_rate
                self.h[key] += (1 - self.decay_rate) * grads[key] * grads[key]
                params[key] -= self.lr * grads[key] / (np.sqrt(self.h[key]) + 1e-7)

    class Nesterov:
        """Nesterov accelerated gradient, rewritten to fit the same update() interface."""
        def __init__(self, lr=0.01, momentum=0.9):
            self.lr = lr
            self.momentum = momentum
            self.v = None

        def update(self, params, grads):
            if self.v is None:
                self.v = {}
                for key, val in params.items():
                    self.v[key] = np.zeros_like(val)
            for key in params.keys():
                self.v[key] *= self.momentum
                self.v[key] -= self.lr * grads[key]
                params[key] += self.momentum * self.momentum * self.v[key]
                params[key] -= (1 + self.momentum) * self.lr * grads[key]

    # Load MNIST; the overfitting demo deliberately trains on only 300 samples
    # (3 iterations of batch size 100 per epoch, matching the logs below).
    (x_train, t_train), (x_test, t_test) = load_mnist(normalize=True)
    x_train, t_train = x_train[:300], t_train[:300]

    use_dropout = True
    dropout_ratio = 0.2
    # note: "dropout_ration" (sic) is the parameter's actual name in MultiLayerNetExtend
    network = MultiLayerNetExtend(input_size=784, hidden_size_list=[100, 100, 100, 100, 100, 100],
                                  output_size=10, use_dropout=use_dropout, dropout_ration=dropout_ratio)
    trainer = Trainer(network, x_train, t_train, x_test, t_test, epochs=301, mini_batch_size=100,
                      optimizer='sgd', optimizer_param={'lr': 0.01}, verbose=True)
    trainer.train()
    train_acc_list, test_acc_list = trainer.train_acc_list, trainer.test_acc_list
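Inside MultiLayerNetExtend, the suppression itself comes from Dropout layers inserted after each activation. A minimal sketch of such a layer, assuming the standard (non-inverted) dropout scheme used by the book's code; the class below is illustrative, not the library's exact source:

```python
import numpy as np

class Dropout:
    """Standard dropout: zero random units while training, scale outputs at test time."""
    def __init__(self, dropout_ratio=0.5):
        self.dropout_ratio = dropout_ratio
        self.mask = None  # remembers which units survived the last training forward pass

    def forward(self, x, train_flg=True):
        if train_flg:
            # keep each unit independently with probability (1 - dropout_ratio)
            self.mask = np.random.rand(*x.shape) > self.dropout_ratio
            return x * self.mask
        # at test time every unit fires, so scale down to the expected train-time magnitude
        return x * (1.0 - self.dropout_ratio)

    def backward(self, dout):
        # gradients flow only through the units that were kept
        return dout * self.mask
```

Because each forward pass samples a fresh mask, every mini-batch effectively trains a different thinned sub-network, which is what discourages co-adaptation and overfitting.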

More output

1. DNN [6×100 + ReLU, SGD]: accuracy without dropout on the MNIST dataset

    train loss:2.3364575765992637
    === epoch:1, train acc:0.10333333333333333, test acc:0.1088 ===
    train loss:2.414526554119518
    train loss:2.341182306768928
    train loss:2.3072782723352496
    === epoch:2, train acc:0.09666666666666666, test acc:0.1103 ===
    train loss:2.2600377181768887
    train loss:2.263350960525319
    train loss:2.2708260374887645
    ……
    === epoch:298, train acc:1.0, test acc:0.7709 ===
    train loss:0.00755416896470134
    train loss:0.009934657874546435
    train loss:0.008421672959852643
    === epoch:299, train acc:1.0, test acc:0.7712 ===
    train loss:0.007142981215285884
    train loss:0.008205245499586114
    train loss:0.007319626293763803
    === epoch:300, train acc:1.0, test acc:0.7707 ===
    train loss:0.00752230499930163
    train loss:0.008431046288276818
    train loss:0.008067532729014863
    === epoch:301, train acc:1.0, test acc:0.7707 ===
    train loss:0.010729407851274233
    train loss:0.007776889701033221
    =============== Final Test Accuracy ===============
    test acc:0.771

2. DNN [6×100 + ReLU, SGD]: accuracy with dropout (0.2) on the MNIST dataset

    train loss:2.3064018541384437
    === epoch:1, train acc:0.11, test acc:0.1112 ===
    train loss:2.316626942558816
    train loss:2.314434337198633
    train loss:2.318862771955365
    === epoch:2, train acc:0.11333333333333333, test acc:0.1128 ===
    train loss:2.3241989320140717
    train loss:2.317694982413387
    train loss:2.3079716553885006
    ……
    === epoch:298, train acc:0.6266666666666667, test acc:0.5168 ===
    train loss:1.2359381134877185
    train loss:1.2833380447791383
    train loss:1.2728131428100005
    === epoch:299, train acc:0.63, test acc:0.52 ===
    train loss:1.1687601000183936
    train loss:1.1435412548991142
    train loss:1.3854277174616834
    === epoch:300, train acc:0.6333333333333333, test acc:0.5244 ===
    train loss:1.3039470016588997
    train loss:1.2359979876607923
    train loss:1.2871396654831204
    === epoch:301, train acc:0.63, test acc:0.5257 ===
    train loss:1.1690084424502523
    train loss:1.1820777530873694
    =============== Final Test Accuracy ===============
    test acc:0.5269
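As an aside, the RMSprop and Nesterov classes defined in the core code are never exercised in this run, since the Trainer is configured with optimizer='sgd'. They can be sanity-checked on a toy quadratic f(x) = x² with gradient 2x; the compact copies and the minimize helper below are illustrative:

```python
import numpy as np

class RMSprop:
    """Same update rule as the RMSprop class above, repeated so this snippet runs standalone."""
    def __init__(self, lr=0.01, decay_rate=0.99):
        self.lr, self.decay_rate, self.h = lr, decay_rate, None

    def update(self, params, grads):
        if self.h is None:
            self.h = {k: np.zeros_like(v) for k, v in params.items()}
        for k in params:
            self.h[k] *= self.decay_rate
            self.h[k] += (1 - self.decay_rate) * grads[k] * grads[k]
            params[k] -= self.lr * grads[k] / (np.sqrt(self.h[k]) + 1e-7)

class Nesterov:
    """Same update rule as the Nesterov class above."""
    def __init__(self, lr=0.01, momentum=0.9):
        self.lr, self.momentum, self.v = lr, momentum, None

    def update(self, params, grads):
        if self.v is None:
            self.v = {k: np.zeros_like(v) for k, v in params.items()}
        for k in params:
            self.v[k] *= self.momentum
            self.v[k] -= self.lr * grads[k]
            params[k] += self.momentum * self.momentum * self.v[k]
            params[k] -= (1 + self.momentum) * self.lr * grads[k]

def minimize(optimizer, x0=5.0, steps=300):
    """Run an optimizer on f(x) = x**2 and return the final x."""
    params = {'x': np.array([x0])}
    for _ in range(steps):
        grads = {'x': 2 * params['x']}  # df/dx = 2x
        optimizer.update(params, grads)
    return float(params['x'][0])

print('RMSprop :', minimize(RMSprop(lr=0.1)))   # ends near 0
print('Nesterov:', minimize(Nesterov(lr=0.1)))  # ends near 0
```

Swapping either class into the Trainer is then a matter of passing an optimizer key the Trainer recognizes, or patching its optimizer lookup.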

Related articles
CSDN: since 2019-04-09


Original link: https://www.xckfsq.com/news/show.html?id=3175