Gradient Descent
(1) mini-batch SGD: mini-batch stochastic gradient descent
(the worked example is written very clearly and is easy to follow)
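A minimal numpy sketch of mini-batch SGD on toy linear-regression data (all sizes, the learning rate, and the batch size here are illustrative, not from the linked example):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy linear-regression data: y = 2x + 1 plus a little noise
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.01 * rng.normal(size=200)

w, b = 0.0, 0.0
lr, batch_size = 0.1, 32

for epoch in range(100):
    idx = rng.permutation(len(X))            # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]
        err = w * xb + b - yb
        # gradient of the mean squared error over this mini-batch only
        w -= lr * 2.0 * np.mean(err * xb)
        b -= lr * 2.0 * np.mean(err)

print(round(w, 2), round(b, 2))  # converges near 2.0 and 1.0
```

Each update uses the gradient of a small random batch rather than the whole dataset, which is what makes the method "mini-batch" and "stochastic".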
CNN:
The purpose of pooling is to preserve certain invariances (rotation, translation, scaling, etc.). Although it reduces the number of features and parameters, my understanding is that it also makes the representation more robust.
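A tiny numpy illustration of the translation-robustness point above: two activations that land in the same 2x2 window pool to the same output, so a small shift of the input does not change the pooled feature map.

```python
import numpy as np

def max_pool_2x2(x):
    # non-overlapping 2x2 max pooling on a 2-D array (H and W divisible by 2)
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

a = np.zeros((4, 4))
a[1, 1] = 5.0   # a feature activation
b = np.zeros((4, 4))
b[0, 0] = 5.0   # the same feature shifted one pixel, still inside the window

print(np.array_equal(max_pool_2x2(a), max_pool_2x2(b)))  # True
```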
Channels are explained well here:
https://blog.csdn.net/sinat_35821976/article/details/81503953
Why convolution/pooling? CNN backpropagation (with a worked example):
https://yq.aliyun.com/articles/637953
Andrew Ng (explained very well, especially the fully connected layers):
https://blog.csdn.net/ice_actor/article/details/78648780
Computing the parameter and connection counts for each CNN layer:
https://blog.csdn.net/dcxhun3/article/details/46878999
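The per-layer counts follow a simple formula; a small sketch using the classic numbers for LeNet-5's first convolutional layer (5x5 kernels, 1 input channel, 6 feature maps, 28x28 output):

```python
def conv_params(kh, kw, c_in, c_out):
    # each output channel has kh*kw*c_in weights plus one bias
    return (kh * kw * c_in + 1) * c_out

def conv_connections(kh, kw, c_in, c_out, out_h, out_w):
    # every output unit connects to kh*kw*c_in inputs plus its bias
    return (kh * kw * c_in + 1) * c_out * out_h * out_w

# LeNet-5 layer C1
print(conv_params(5, 5, 1, 6))               # 156 trainable parameters
print(conv_connections(5, 5, 1, 6, 28, 28))  # 122304 connections
```

Weight sharing is why the parameter count stays tiny while the connection count is large.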
tf.nn.max_pool parameters:
https://blog.csdn.net/coder_xiaohui/article/details/78025379
LeNet:
https://blog.csdn.net/qianqing13579/article/details/71076261
AlexNet: introduced ReLU and dropout
https://blog.csdn.net/chaipp0607/article/details/72847422
http://noahsnail.com/2017/07/18/2017-07-18-AlexNet%E8%AE%BA%E6%96%87%E7%BF%BB%E8%AF%91%E2%80%94%E2%80%94%E4%B8%AD%E6%96%87%E7%89%88/
More translated papers: https://github.com/SnailTyan/deep-learning-papers-translation
Pros and cons of activation functions: https://blog.csdn.net/u011684265/article/details/78039280
https://www.jianshu.com/p/89956fbb7098
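A quick numerical illustration of one trade-off these posts discuss: the sigmoid gradient saturates for large |x| (and is at most 0.25), while ReLU passes gradient 1 for any positive input:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)   # peaks at 0.25, vanishes for large |x|

def relu_grad(x):
    return (x > 0).astype(float)   # 0 for negatives, 1 for positives

x = np.array([-10.0, 0.0, 10.0])
print(sigmoid_grad(x))   # tiny at the ends, 0.25 in the middle
print(relu_grad(x))      # [0. 0. 1.]
```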
What changed in AlexNet compared with LeNet?
https://www.cnblogs.com/qw12/p/8470730.html
Evolution of CNN architectures (the post below is a very good summary that covers almost every network structure):
https://www.cnblogs.com/guoyaohua/p/8534077.html
Inception:
http://baijiahao.baidu.com/s?id=1601882944953788623&wfr=spider&for=pc
ResNet:
https://baijiahao.baidu.com/s?id=1609100487339160987&wfr=spider&for=pc
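A minimal numpy sketch of the idea behind a residual block (hypothetical shapes; real blocks use convolutions and batch norm): with the shortcut y = F(x) + x, the block only has to learn the residual F(x), and the identity mapping is trivial to represent.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    # y = relu(F(x) + x): the identity shortcut lets gradients flow straight through
    return relu(relu(x @ W1) @ W2 + x)

d = 4
x = rng.normal(size=(1, d))
# with zero weights F(x) = 0, so the block collapses to the identity (after ReLU)
W1 = np.zeros((d, d))
W2 = np.zeros((d, d))
print(np.array_equal(residual_block(x, W1, W2), relu(x)))  # True
```

This is why very deep residual networks are easier to train: a layer that has nothing useful to learn can simply pass its input through.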
Why zero-center the data? It speeds up backpropagation training. (One explanation: if all inputs to a layer share the same sign, the weight gradients for that layer are forced to share a sign too, producing zig-zag updates; zero-centering removes this bias.)
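Zero-centering itself is one line per feature column; a sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(100, 3))   # all-positive features

Xc = X - X.mean(axis=0)                  # subtract the per-feature mean
print(np.allclose(Xc.mean(axis=0), 0.0)) # True
```

The mean is computed on the training set and the same offset is reused at test time.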
CNN for text:
Sentence classification with a CNN:
https://www.jianshu.com/p/594d1984fbd9
http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/ (complete data and TensorFlow code)
Helpful diagrams: http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/
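A minimal numpy sketch of the textCNN idea from the posts above (sentence length and embedding size are made up): slide filters of several region sizes over the token embeddings, then take the max over time, giving one feature per filter for the classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy sentence: 7 tokens, embedding dimension 5 (hypothetical sizes)
sent = rng.normal(size=(7, 5))

def conv_max_over_time(sent, filt):
    # slide a (k, d) filter over every window of k tokens,
    # then max-over-time pooling keeps the strongest response
    k = filt.shape[0]
    feats = [np.sum(sent[i:i + k] * filt) for i in range(sent.shape[0] - k + 1)]
    return max(feats)

# one filter per region size (3, 4, 5), as in the textCNN setup
features = [conv_max_over_time(sent, rng.normal(size=(k, 5))) for k in (3, 4, 5)]
print(len(features))  # 3 pooled features; these would feed a softmax classifier
```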
textCNN和RNN
ABCNN for sentence classification (attention-based CNN)
Attention-based CNN:
https://cloud.tencent.com/developer/news/336598
http://www.cnblogs.com/robert-dlut/p/5952032.html
http://www.sohu.com/a/132281781_505880
https://blog.csdn.net/tcx1992/article/details/83344272
RNN
https://www.cnblogs.com/pinard/p/6509630.html
https://www.cnblogs.com/pinard/p/6437495.html
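The core recurrence from these posts, sketched in numpy (sizes are illustrative): the same weight matrices are reused at every time step, and the hidden state carries context forward.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_h, T = 3, 4, 5
Wx = rng.normal(scale=0.1, size=(d_in, d_h))  # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(d_h, d_h))   # hidden-to-hidden weights
b = np.zeros(d_h)

xs = rng.normal(size=(T, d_in))
h = np.zeros(d_h)
for x in xs:
    # h_t = tanh(x_t Wx + h_{t-1} Wh + b), same parameters at every step
    h = np.tanh(x @ Wx + h @ Wh + b)
print(h.shape)  # (4,)
```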
LSTM
https://www.cnblogs.com/jiangxinyang/p/9362922.html
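A minimal numpy sketch of one LSTM cell step (hypothetical sizes; gate names follow the standard formulation in the link above):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d_in, d_h = 3, 4
# one weight matrix per gate: forget, input, output, candidate
W = {g: rng.normal(scale=0.1, size=(d_in + d_h, d_h)) for g in "fioc"}
b = {g: np.zeros(d_h) for g in "fioc"}

def lstm_step(x, h, c):
    z = np.concatenate([x, h])
    f = sigmoid(z @ W["f"] + b["f"])       # forget gate
    i = sigmoid(z @ W["i"] + b["i"])       # input gate
    o = sigmoid(z @ W["o"] + b["o"])       # output gate
    c_tilde = np.tanh(z @ W["c"] + b["c"]) # candidate cell state
    c = f * c + i * c_tilde                # additive cell update eases gradient flow
    h = o * np.tanh(c)
    return h, c

h = c = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):
    h, c = lstm_step(x, h, c)
print(h.shape, c.shape)  # (4,) (4,)
```

The additive update of the cell state c is what lets the LSTM carry information over long spans without the vanishing gradients of a plain RNN.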
RNN example:
https://segmentfault.com/a/1190000014256396?utm_medium=referral&utm_source=tuicool
seq2seq
https://baijiahao.baidu.com/s?id=1597817868542771682&wfr=spider&for=pc&isFailFlag=1
Overfitting in deep learning
https://www.cnblogs.com/eilearn/p/9203186.html
NLP data augmentation: https://blog.csdn.net/Adupanfei/article/details/84956566
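One common NLP augmentation is synonym replacement (EDA-style); a toy sketch with a made-up synonym table (real setups use WordNet or embedding neighbors):

```python
import random

random.seed(0)

# toy synonym table (hypothetical)
synonyms = {"good": ["great", "fine"], "movie": ["film"], "bad": ["poor"]}

def synonym_replace(tokens, n=1):
    # swap up to n randomly chosen words for a random synonym
    out = list(tokens)
    candidates = [i for i, t in enumerate(out) if t in synonyms]
    random.shuffle(candidates)
    for i in candidates[:n]:
        out[i] = random.choice(synonyms[out[i]])
    return out

print(synonym_replace("a good movie".split(), n=1))
```

Each call produces a slightly different sentence with (hopefully) the same label, which regularizes the model like extra training data.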
5. Overfitting
dropout: randomly deactivate neurons during training
https://blog.csdn.net/program_developer/article/details/80737724
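A sketch of inverted dropout, the variant described in the link above: survivors are rescaled by 1/(1-p) during training so the expected activation is unchanged and nothing needs to happen at test time.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    # inverted dropout: zero each activation with probability p,
    # rescale the survivors by 1/(1-p); a no-op outside training
    if not training:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

a = np.ones(10)
print(dropout(a, p=0.5))          # entries are either 0.0 or 2.0
print(dropout(a, training=False)) # unchanged at test time
```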