Second Kaggle CIFAR-10 Submission

Based on the results of the previous experiment, this run mainly adjusted the learning rate schedule: the update period lr_period was changed to 30, so the learning rate is multiplied by 0.1 every 30 epochs. Training ran for 80 epochs in total and produced a better result than the previous 100-epoch run.
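For reference, below is a minimal sketch of the step decay described above, assuming a PyTorch-style setup; the placeholder model, the elided training pass, and the plain SGD settings are illustrative assumptions, not the actual code used for this run.

import torch
from torch import nn, optim

# Hyperparameters from this run: start at lr = 0.1,
# multiply by 0.1 every 30 epochs, train for 80 epochs in total.
lr, lr_period, lr_decay, num_epochs = 0.1, 30, 0.1, 80

net = nn.Linear(3 * 32 * 32, 10)                # placeholder model (assumption)
optimizer = optim.SGD(net.parameters(), lr=lr)  # momentum / weight decay omitted

for epoch in range(1, num_epochs + 1):
    # ... one full pass over the training set would go here ...
    print(f'epoch {epoch}, lr {optimizer.param_groups[0]["lr"]}')
    if epoch % lr_period == 0:
        # After every lr_period epochs, scale the learning rate by lr_decay.
        # Repeated float multiplication is why the log below shows
        # values like 0.010000000000000002 instead of exactly 0.01.
        for group in optimizer.param_groups:
            group['lr'] *= lr_decay

The same schedule can also be written with torch.optim.lr_scheduler.StepLR(optimizer, step_size=lr_period, gamma=lr_decay) and a scheduler.step() call once per epoch.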
The training log is as follows:

epoch 1, loss 2.203336, train acc 0.232400, valid acc 0.343200, time 00:19:08, lr 0.1
epoch 2, loss 1.602243, train acc 0.408400, valid acc 0.416200, time 00:06:56, lr 0.1
epoch 3, loss 1.383010, train acc 0.496811, valid acc 0.502400, time 00:05:54, lr 0.1
epoch 4, loss 1.198685, train acc 0.570396, valid acc 0.549600, time 00:05:42, lr 0.1
epoch 5, loss 1.017291, train acc 0.638795, valid acc 0.559800, time 00:03:41, lr 0.1
epoch 6, loss 0.848029, train acc 0.703396, valid acc 0.604600, time 00:02:34, lr 0.1
epoch 7, loss 0.734469, train acc 0.744974, valid acc 0.715800, time 00:02:06, lr 0.1
epoch 8, loss 0.672550, train acc 0.767013, valid acc 0.718200, time 00:01:33, lr 0.1
epoch 9, loss 0.631891, train acc 0.779381, valid acc 0.678200, time 00:01:18, lr 0.1
epoch 10, loss 0.587453, train acc 0.797852, valid acc 0.709600, time 00:01:25, lr 0.1
epoch 11, loss 0.560288, train acc 0.807876, valid acc 0.695400, time 00:01:42, lr 0.1
epoch 12, loss 0.534758, train acc 0.815565, valid acc 0.778000, time 00:01:48, lr 0.1
epoch 13, loss 0.524292, train acc 0.817706, valid acc 0.757600, time 00:01:37, lr 0.1
epoch 14, loss 0.507128, train acc 0.824754, valid acc 0.706800, time 00:01:15, lr 0.1
epoch 15, loss 0.489780, train acc 0.831289, valid acc 0.739000, time 00:01:12, lr 0.1
epoch 16, loss 0.478400, train acc 0.834487, valid acc 0.747800, time 00:01:11, lr 0.1
epoch 17, loss 0.464805, train acc 0.840971, valid acc 0.809400, time 00:01:12, lr 0.1
epoch 18, loss 0.452167, train acc 0.845787, valid acc 0.639600, time 00:01:11, lr 0.1
epoch 19, loss 0.451828, train acc 0.842865, valid acc 0.773000, time 00:01:11, lr 0.1
epoch 20, loss 0.434691, train acc 0.848778, valid acc 0.786400, time 00:01:11, lr 0.1
epoch 21, loss 0.426287, train acc 0.853385, valid acc 0.784400, time 00:01:11, lr 0.1
epoch 22, loss 0.432857, train acc 0.851538, valid acc 0.683200, time 00:01:12, lr 0.1
epoch 23, loss 0.421968, train acc 0.855353, valid acc 0.770800, time 00:01:14, lr 0.1
epoch 24, loss 0.418132, train acc 0.857629, valid acc 0.809400, time 00:01:12, lr 0.1
epoch 25, loss 0.406591, train acc 0.861239, valid acc 0.808200, time 00:01:11, lr 0.1
epoch 26, loss 0.400529, train acc 0.861972, valid acc 0.794400, time 00:01:11, lr 0.1
epoch 27, loss 0.401173, train acc 0.861750, valid acc 0.777200, time 00:01:14, lr 0.1
epoch 28, loss 0.397533, train acc 0.862583, valid acc 0.752800, time 00:01:11, lr 0.1
epoch 29, loss 0.385643, train acc 0.867461, valid acc 0.790000, time 00:01:10, lr 0.1
epoch 30, loss 0.393749, train acc 0.862204, valid acc 0.802000, time 00:01:10, lr 0.1

After epoch 30, the learning rate is scaled to 0.1 of its previous value, and accuracy improves noticeably (the 0.010000000000000002 in the log is simply the floating-point result of 0.1 × 0.1).
epoch 31, loss 0.239432, train acc 0.919860, valid acc 0.881800, time 00:01:19, lr 0.010000000000000002
epoch 32, loss 0.178016, train acc 0.939436, valid acc 0.888000, time 00:01:11, lr 0.010000000000000002
epoch 33, loss 0.154890, train acc 0.948509, valid acc 0.900200, time 00:01:09, lr 0.010000000000000002
epoch 34, loss 0.135379, train acc 0.953608, valid acc 0.891800, time 00:01:10, lr 0.010000000000000002
epoch 35, loss 0.123970, train acc 0.958859, valid acc 0.894000, time 00:01:11, lr 0.010000000000000002
epoch 36, loss 0.112327, train acc 0.962989, valid acc 0.894000, time 00:01:11, lr 0.010000000000000002
epoch 37, loss 0.103455, train acc 0.965418, valid acc 0.895400, time 00:01:11, lr 0.010000000000000002
epoch 38, loss 0.091813, train acc 0.969527, valid acc 0.893200, time 00:01:10, lr 0.010000000000000002
epoch 39, loss 0.084983, train acc 0.972227, valid acc 0.905800, time 00:01:34, lr 0.010000000000000002
epoch 40, loss 0.078403, train acc 0.974415, valid acc 0.897600, time 00:01:22, lr 0.010000000000000002
epoch 41, loss 0.072169, train acc 0.975768, valid acc 0.899200, time 00:01:17, lr 0.010000000000000002
epoch 42, loss 0.071349, train acc 0.976733, valid acc 0.894000, time 00:01:16, lr 0.010000000000000002
epoch 43, loss 0.064847, train acc 0.977988, valid acc 0.889400, time 00:01:14, lr 0.010000000000000002
epoch 44, loss 0.062040, train acc 0.979751, valid acc 0.899800, time 00:01:14, lr 0.010000000000000002
epoch 45, loss 0.060873, train acc 0.979897, valid acc 0.906400, time 00:01:14, lr 0.010000000000000002
epoch 46, loss 0.060420, train acc 0.981006, valid acc 0.889200, time 00:01:14, lr 0.010000000000000002
epoch 47, loss 0.053726, train acc 0.982787, valid acc 0.896400, time 00:01:13, lr 0.010000000000000002
epoch 48, loss 0.053420, train acc 0.982893, valid acc 0.899200, time 00:01:16, lr 0.010000000000000002
epoch 49, loss 0.055049, train acc 0.981805, valid acc 0.896400, time 00:01:16, lr 0.010000000000000002
epoch 50, loss 0.052733, train acc 0.983001, valid acc 0.895400, time 00:01:14, lr 0.010000000000000002
epoch 51, loss 0.051988, train acc 0.982982, valid acc 0.889600, time 00:01:15, lr 0.010000000000000002
epoch 52, loss 0.048979, train acc 0.983786, valid acc 0.904000, time 00:01:13, lr 0.010000000000000002
epoch 53, loss 0.047615, train acc 0.983975, valid acc 0.891600, time 00:01:18, lr 0.010000000000000002
epoch 54, loss 0.046565, train acc 0.985263, valid acc 0.892400, time 00:01:15, lr 0.010000000000000002
epoch 55, loss 0.047992, train acc 0.984434, valid acc 0.890200, time 00:01:24, lr 0.010000000000000002
epoch 56, loss 0.045407, train acc 0.985956, valid acc 0.875400, time 00:01:26, lr 0.010000000000000002
epoch 57, loss 0.050706, train acc 0.983586, valid acc 0.899800, time 00:04:05, lr 0.010000000000000002
epoch 58, loss 0.048481, train acc 0.984247, valid acc 0.890800, time 00:07:26, lr 0.010000000000000002
epoch 59, loss 0.047978, train acc 0.984313, valid acc 0.885800, time 00:12:23, lr 0.010000000000000002
epoch 60, loss 0.049937, train acc 0.983820, valid acc 0.895000, time 00:08:38, lr 0.010000000000000002

At epoch 60 the learning rate is reduced by another factor of 10, and valid acc rises from about 0.89 to around 0.91.
epoch 61, loss 0.036304, train acc 0.988663, valid acc 0.907000, time 00:05:09, lr 0.0010000000000000002
epoch 62, loss 0.025534, train acc 0.992565, valid acc 0.910200, time 00:04:32, lr 0.0010000000000000002
epoch 63, loss 0.023373, train acc 0.993630, valid acc 0.910000, time 00:04:23, lr 0.0010000000000000002
epoch 64, loss 0.020479, train acc 0.994257, valid acc 0.913200, time 00:03:51, lr 0.0010000000000000002
epoch 65, loss 0.018151, train acc 0.995455, valid acc 0.913600, time 00:03:47, lr 0.0010000000000000002
epoch 66, loss 0.017228, train acc 0.995605, valid acc 0.914000, time 00:04:30, lr 0.0010000000000000002
epoch 67, loss 0.017362, train acc 0.995482, valid acc 0.912600, time 00:05:08, lr 0.0010000000000000002
epoch 68, loss 0.016431, train acc 0.996027, valid acc 0.912800, time 00:04:14, lr 0.0010000000000000002
epoch 69, loss 0.016355, train acc 0.995783, valid acc 0.913800, time 00:04:44, lr 0.0010000000000000002
epoch 70, loss 0.015025, train acc 0.996037, valid acc 0.915600, time 00:04:01, lr 0.0010000000000000002
epoch 71, loss 0.014929, train acc 0.996360, valid acc 0.914400, time 00:01:34, lr 0.0010000000000000002
epoch 72, loss 0.013791, train acc 0.996804, valid acc 0.917200, time 00:01:37, lr 0.0010000000000000002
epoch 73, loss 0.013601, train acc 0.996782, valid acc 0.912200, time 00:01:42, lr 0.0010000000000000002
epoch 74, loss 0.013637, train acc 0.996515, valid acc 0.914000, time 00:02:28, lr 0.0010000000000000002
epoch 75, loss 0.013018, train acc 0.996853, valid acc 0.915000, time 00:04:16, lr 0.0010000000000000002
epoch 76, loss 0.012425, train acc 0.997004, valid acc 0.914000, time 00:04:32, lr 0.0010000000000000002
epoch 77, loss 0.012557, train acc 0.997004, valid acc 0.913800, time 00:04:11, lr 0.0010000000000000002
epoch 78, loss 0.012041, train acc 0.997425, valid acc 0.913000, time 00:03:44, lr 0.0010000000000000002
epoch 79, loss 0.012500, train acc 0.996693, valid acc 0.915600, time 00:04:25, lr 0.0010000000000000002
epoch 80, loss 0.010953, train acc 0.997869, valid acc 0.916800, time 00:01:52, lr 0.0010000000000000002

After this point, valid acc climbs only slowly, while the train acc (approaching 0.998) clearly shows that the model is overfitting.
By shrinking lr_period, this adjustment let the model converge faster and saved a fair amount of training time, but the gain in accuracy was modest. Pushing the score further will require looking at other aspects of the tuning strategy.
