dropout, dropconnect, maxout, batch normalization

This success of CNNs is due partly to the availability of large datasets and high-performance computing systems, and partly to recent advances in learning methods and regularization techniques such as dropout [27], dropconnect [31], maxout [5], and batch normalization [8].
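As a concrete illustration, here is a minimal NumPy sketch of two of the techniques mentioned above: inverted dropout in the spirit of [27] and the per-feature batch-normalization transform of [8]. Function names, the dropout rate, and the `gamma`/`beta` defaults are illustrative choices, not part of any cited implementation.

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout: at training time, zero each unit with probability p
    and rescale survivors by 1/(1-p) so the expected activation is unchanged;
    at test time, return x as-is."""
    if not train or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # True = keep the unit
    return x * mask / (1.0 - p)

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization (training-mode forward pass): normalize each
    feature over the batch axis, then apply a learnable scale and shift."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mu) / np.sqrt(var + eps) + beta
```

In practice both operations also need a backward pass and, for batch normalization, running statistics for inference; this sketch only shows the forward transforms that the regularization effect comes from.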

[27] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever,
and R. Salakhutdinov. Dropout: A simple way to prevent
neural networks from overfitting. The Journal of
Machine Learning Research, 15(1):1929–1958, 2014.

[31] L. Wan, M. Zeiler, S. Zhang, Y. LeCun, and R. Fergus.
Regularization of neural networks using dropconnect.
In Proceedings of the 30th International Conference
on Machine Learning (ICML-13), pages 1058–1066,
2013.

[5] I. J. Goodfellow, D. Warde-Farley, M. Mirza,
A. Courville, and Y. Bengio. Maxout networks. arXiv
preprint arXiv:1302.4389, 2013.

[8] S. Ioffe and C. Szegedy. Batch normalization: Accelerating
deep network training by reducing internal covariate
shift. arXiv preprint arXiv:1502.03167, 2015.
