Classification network papers: key-point summaries, excerpts, and architecture diagrams (AlexNet, Inception, ResNet, Inception-ResNet, DPN, SENet, PolyNet, NASNet-A)

Models:
AlexNet
VGG
GoogLeNet or Inception-v1
Inception-v2
Inception-v3
Inception-v4
Inception-ResNet-v1
Inception-ResNet-v2
ResNet-v1
ResNet-v2
ResNeXt
SENet
DenseNet
DPN
PolyNet
NASNet-A
Impressions from using these models: most of the later, more complex models are essentially ensembles of the core ideas from Inception, ResNet, and DenseNet (the last being relatively uncommon in practice). Under Keras, DenseNet consumes a lot of memory and trains slowly, and the classification accuracy it achieved in my experiments was not ideal, so I rarely use it.
AlexNet
[Figure: AlexNet architecture]
VGG
Compelling feature: architectural simplicity. Drawback: high computational cost.
[Figure: VGG architecture]
GoogLenet or Inception-v1
(https://arxiv.org/pdf/1409.4842.pdf)
Auxiliary classifiers were added to improve the convergence of very deep networks; during training, the losses of the auxiliary classifiers were weighted by 0.3 in the total loss.
[Figure: GoogLeNet / Inception-v1 architecture]
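The auxiliary-loss weighting described above can be sketched as simple arithmetic (a minimal sketch; the function name and the example loss values are mine, not from the paper — only the 0.3 weight is):

```python
# Sketch of GoogLeNet's training objective: the main classifier's loss plus
# each auxiliary classifier's loss scaled by 0.3 (the weight from the paper).
def googlenet_total_loss(main_loss, aux_losses, aux_weight=0.3):
    """Total loss = main loss + 0.3 * sum of auxiliary losses."""
    return main_loss + aux_weight * sum(aux_losses)

# Example with two auxiliary heads (loss values are placeholders):
total = googlenet_total_loss(1.0, [0.5, 0.5])
print(total)  # 1.3
```

At inference time the auxiliary heads are discarded; they only contribute gradients during training.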
Inception-v2 or BN-Inception
batch normalization
Replace 5×5 convolutions with two stacked 3×3 convolutions
exploring ways to scale up networks in ways that aim at utilizing the added computation as efficiently as possible by suitably factorized convolutions and aggressive regularization.
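The 5×5 → two-3×3 replacement above keeps the same receptive field while cutting parameters; a quick arithmetic check (the channel width is an arbitrary example value, not from the paper):

```python
# Parameter count of a k x k convolution mapping in_ch -> out_ch channels
# (biases ignored for simplicity).
def conv_params(k, in_ch, out_ch):
    return k * k * in_ch * out_ch

c = 64  # example channel width
one_5x5 = conv_params(5, c, c)       # 25 * c^2 parameters
two_3x3 = 2 * conv_params(3, c, c)   # 18 * c^2, same 5x5 receptive field
print(two_3x3 / one_5x5)  # 0.72 -> 28% fewer parameters
```

The extra nonlinearity between the two 3×3 layers is a side benefit of the factorization.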
Inception-v3
(https://arxiv.org/pdf/1512.00567.pdf)
Accuracy: Inception-v3 ≈ ResNet (benchmark: 2015 ILSVRC challenge)
There is also some evidence of residual Inception networks outperforming similarly expensive Inception networks without residual connections by a thin margin.
Key point: factorization of big size convolutions
Employing this factorization does not work well on early layers, but it gives very good results on medium grid sizes (on m×m feature maps, where m ranges between 12 and 20).
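The factorization referred to here splits an n×n convolution into a 1×n convolution followed by an n×1 convolution. The parameter saving is pure arithmetic (per input/output channel pair, so channel counts cancel out of the ratio):

```python
# Parameter ratio of Inception-v3's asymmetric factorization:
# an n x n convolution replaced by a 1 x n followed by an n x 1.
def asym_ratio(n):
    full = n * n       # weights of the n x n kernel (per channel pair)
    factored = 2 * n   # 1 x n plus n x 1 with matching channels
    return factored / full

print(asym_ratio(7))  # ~0.286: the factored 7x7 costs about 29% as much
```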

Avoid representational bottlenecks: in general, the representation size should decrease gently from the inputs to the outputs before reaching the final representation; abrupt dimensionality reduction loses information that later layers cannot recover.
Solution: see figure 7 of the paper (https://baijiahao.baidu.com/s?id=1601882944953788623&wfr=spider&for=pc)
[Figure: Inception-v3 module structure]
