ResNet and Batch Normalization

[Deep Learning] An In-Depth Understanding of Batch Normalization

https://www.zhihu.com/topic/20084849/hot
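The Batch Normalization material linked above comes down to one simple transform: normalize each feature over the mini-batch, then apply a learned scale and shift. A minimal NumPy sketch of the training-time forward pass (inference with running statistics is omitted):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch Normalization forward pass, training mode.

    x: (N, D) mini-batch; gamma, beta: (D,) learned scale and shift.
    """
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # ~zero mean, unit variance
    return gamma * x_hat + beta            # restore representational freedom

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))  # shifted, scaled inputs
out = batch_norm_forward(x, gamma=np.ones(8), beta=np.zeros(8))
print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # True
```

With gamma = 1 and beta = 0 the output is exactly the normalized activations; in a real network gamma and beta are trained, so the layer can undo the normalization if that is what optimization prefers.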

What does F(x) in ResNet (residual networks) actually look like?

https://www.zhihu.com/question/53224378
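As for what F(x) looks like: in the basic block of the original paper, F is a small stack of weight layers (two, with a ReLU in between), and the block outputs x + F(x). A minimal NumPy sketch, using fully-connected layers in place of convolutions and omitting biases and BN for brevity:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def residual_block(x, W1, W2):
    """Basic residual block: y = relu(x + F(x)), where the residual
    branch F is two weight layers with a ReLU in between."""
    f = relu(x @ W1) @ W2   # F(x): the residual branch
    return relu(x + f)      # skip connection adds the input back

rng = np.random.default_rng(1)
d = 4
x = rng.normal(size=(2, d))
W1 = rng.normal(size=(d, d)) * 0.1
W2 = rng.normal(size=(d, d)) * 0.1
y = residual_block(x, W1, W2)
```

Setting both weight matrices to zero makes F(x) = 0, so the block collapses to (a ReLU of) the identity — which is why adding extra blocks cannot easily make a deeper network worse than its shallower counterpart.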

How should we understand Microsoft's deep residual learning?

https://www.zhihu.com/question/38499534?sort=created

 

Skip Connections Eliminate Singularities

https://arxiv.org/pdf/1701.09175.pdf

A Detailed Explanation of Residual Networks

https://zhuanlan.zhihu.com/p/42706477

 

Principles of Residual Networks

https://blog.csdn.net/qq_30478885/article/details/78828734

https://www.coursera.org/lecture/convolutional-neural-networks/why-resnets-work-XAKNO

https://arxiv.org/pdf/1512.03385.pdf

https://www.quora.com/How-does-deep-residual-learning-work

https://arxiv.org/pdf/1603.05027.pdf

The residual block in ResNet is built to allow an identity mapping — but what is the significance of such an identity mapping, and what role does it play in the network?

https://www.zhihu.com/question/293243905
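One common answer to the question above: because y = x + F(x), the derivative is 1 + dF/dx, so the gradient can pass through the skip connection undiminished even when the residual branch contributes nothing. A tiny scalar check (F(x) = w·x is a made-up toy branch, not the block from the paper):

```python
# The identity path guarantees dy/dx = 1 + dF/dx, so deep stacks of
# residual blocks avoid purely multiplicative gradient decay.

def f(x, w):
    """Toy residual branch F(x) = w * x (hypothetical, for illustration)."""
    return w * x

def residual(x, w):
    return x + f(x, w)   # y = x + F(x)

x0, w = 2.0, 0.0         # residual branch switched off entirely
h = 1e-6
num_grad = (residual(x0 + h, w) - residual(x0 - h, w)) / (2 * h)
print(num_grad)          # ≈ 1.0: gradient still flows via the skip path
```

With w = 0 the branch is dead, yet the numerical gradient stays at 1; a plain (non-residual) layer with zero weights would pass no gradient at all.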

https://zhuanlan.zhihu.com/p/28124810 

https://arxiv.org/pdf/1502.03167v3.pdf

https://zhuanlan.zhihu.com/p/31645196

https://arxiv.org/pdf/1506.01497v3.pdf

https://arxiv.org/pdf/1504.08083.pdf

https://arxiv.org/pdf/1311.2524v5.pdf

https://arxiv.org/pdf/1702.08591.pdf

https://arxiv.org/pdf/1611.05431.pdf

https://arxiv.org/pdf/1607.07032.pdf

 

https://arxiv.org/abs/1605.06431

Understanding Residual Networks

Covariance

https://www.zhihu.com/question/20852004
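Covariance measures how two variables deviate from their means together: cov(X, Y) = E[(X − E[X])(Y − E[Y])]. A quick NumPy check, computing the population covariance both by the definition and with np.cov:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # y = 2x: perfectly co-varying

# By the definition (population covariance, divide by N)
manual = np.mean((x - x.mean()) * (y - y.mean()))

# Same quantity via NumPy; ddof=0 selects the population estimate
library = np.cov(x, y, ddof=0)[0, 1]

print(manual, library)  # both 2.5
```

Positive covariance means the variables tend to move in the same direction; zero means no linear relationship.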

The ResNet Architecture Is Invertible! University of Toronto and Collaborators Propose High-Performing Invertible Residual Networks

An Overview of ResNet and Its Many Variants

 

A Code Walkthrough of a TensorFlow Implementation of ResNet V2

Identity Mapping in ResNet

1. Study the residual-network material in Andrew Ng's "Deep Learning" course on Coursera.
2. Read the model's original paper, Deep Residual Learning for Image Recognition; if it is hard to follow, online translations can help, and the author's own translation is available for reference.
3. Register on GitHub to view and download the open-source residual-network code. Registration link
4. Copy the source code to your local machine. Source link here

 

[1] He K, Zhang X, Ren S, et al. Deep residual learning for image recognition[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016: 770-778.

[2] Srivastava R K, Greff K, Schmidhuber J. Highway networks[J]. arXiv preprint arXiv:1505.00387, 2015.

[3] Orhan A E, Pitkow X. Skip connections eliminate singularities[J]. arXiv preprint arXiv:1701.09175, 2017.

[4] Shang W, Sohn K, Almeida D, et al. Understanding and improving convolutional neural networks via concatenated rectified linear units[C]//International Conference on Machine Learning. 2016: 2217-2225.

[5] Greff K, Srivastava R K, Schmidhuber J. Highway and residual networks learn unrolled iterative estimation[J]. arXiv preprint arXiv:1612.07771, 2016.

[6] Jastrzebski S, Arpit D, Ballas N, et al. Residual connections encourage iterative inference[J]. arXiv preprint arXiv:1710.04773, 2017.

 

Reposted from: https://www.cnblogs.com/WCFGROUP/p/8999877.html
