Convolutional Neural Network Models: LeNet, AlexNet, VGG, GoogLeNet, ResNet

https://medium.com/analytics-vidhya/cnns-architectures-lenet-alexnet-vgg-googlenet-resnet-and-more-666091488df5

LeNet-5 (1998)

LeNet-5 is a pioneering 7-level convolutional network introduced by LeCun et al. in 1998.

It takes 32x32 pixel greyscale input images.
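A minimal PyTorch sketch of a LeNet-5-style network for 32x32 greyscale inputs. The layer sizes follow the classic description; the choice of tanh activations and average pooling here is an assumption for illustration.

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """LeNet-5-style network for 32x32 greyscale inputs (e.g. MNIST padded to 32x32)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of 32x32 greyscale images
logits = LeNet5()(torch.randn(4, 1, 32, 32))  # -> shape (4, 10)
```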

AlexNet (2012)

It uses 11x11, 5x5, and 3x3 convolutions, max pooling, dropout, data augmentation, ReLU activations, and SGD with momentum.

It attaches ReLU activations after every convolutional and fully-connected layer.
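A condensed PyTorch sketch of an AlexNet-style stack, showing the 11x11/5x5/3x3 kernels, max pooling, dropout, and a ReLU after every convolutional and fully-connected layer. The channel counts follow the common single-GPU variant; the original paper split the network across two GPUs.

```python
import torch
import torch.nn as nn

class AlexNetSketch(nn.Module):
    """AlexNet-style network for 224x224 RGB inputs."""
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5), nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),
            nn.Dropout(0.5), nn.Linear(4096, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# SGD with momentum, as used to train AlexNet (hyperparameters here are illustrative)
model = AlexNetSketch()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=5e-4)
```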

ZFNet (2013)

GoogLeNet/Inception (2014)

Its Inception module uses several very small convolutions in order to drastically reduce the number of parameters.

The architecture is a 22-layer deep CNN, yet it reduces the number of parameters from 60 million (AlexNet) to 4 million.
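A sketch of one Inception-style module in PyTorch, illustrating how parallel small convolutions, with 1x1 "bottleneck" convolutions to cut channel counts before the larger kernels, keep the parameter count low. The example widths follow the commonly cited "inception (3a)" configuration; this is a simplified sketch, not the full GoogLeNet.

```python
import torch
import torch.nn as nn

class InceptionModule(nn.Module):
    """Inception-style block: parallel 1x1, 3x3, 5x5 convolutions and pooling,
    with 1x1 convolutions used to reduce channels before the larger kernels."""
    def __init__(self, in_ch: int, c1: int, c3r: int, c3: int, c5r: int, c5: int, cp: int):
        super().__init__()
        self.branch1 = nn.Sequential(nn.Conv2d(in_ch, c1, 1), nn.ReLU(inplace=True))
        self.branch3 = nn.Sequential(
            nn.Conv2d(in_ch, c3r, 1), nn.ReLU(inplace=True),          # 1x1 reduction
            nn.Conv2d(c3r, c3, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.branch5 = nn.Sequential(
            nn.Conv2d(in_ch, c5r, 1), nn.ReLU(inplace=True),          # 1x1 reduction
            nn.Conv2d(c5r, c5, 5, padding=2), nn.ReLU(inplace=True),
        )
        self.branch_pool = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Conv2d(in_ch, cp, 1), nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate the four branches along the channel dimension.
        return torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x), self.branch_pool(x)], dim=1
        )

# Example: "inception (3a)" sizes (192 input channels -> 64+128+32+32 = 256 output channels)
block = InceptionModule(192, 64, 96, 128, 16, 32, 32)
out = block(torch.randn(1, 192, 28, 28))  # -> (1, 256, 28, 28)
```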

VGGNet (2014)

VGGNet (VGG-16) consists of 16 weight layers and is very appealing because of its very uniform architecture.

It uses only 3x3 convolutions, but lots of filters, and was trained on 4 GPUs for 2–3 weeks.
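A sketch of VGG's uniform design in PyTorch: each stage stacks a few 3x3 convolutions and ends with 2x2 max pooling, while the filter count grows. The channel progression below follows the standard VGG-16 configuration; the fully-connected classifier is omitted.

```python
import torch
import torch.nn as nn

def vgg_stage(in_ch: int, out_ch: int, num_convs: int) -> nn.Sequential:
    """A VGG-style stage: `num_convs` 3x3 convolutions followed by 2x2 max pooling."""
    layers = []
    for i in range(num_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

# VGG-16 convolutional part: only 3x3 kernels, but many filters (64 -> 512)
features = nn.Sequential(
    vgg_stage(3, 64, 2),
    vgg_stage(64, 128, 2),
    vgg_stage(128, 256, 3),
    vgg_stage(256, 512, 3),
    vgg_stage(512, 512, 3),
)
out = features(torch.randn(1, 3, 224, 224))  # -> (1, 512, 7, 7)
```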

ResNet (2015)

ResNet introduces "skip connections" and features heavy batch normalization.

Such skip connections resemble the gating mechanisms (for example, gated recurrent units) that have recently been applied successfully in RNNs.

The deepest variant has 152 layers while still having lower complexity than VGGNet.
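A sketch of a basic residual block in PyTorch: the input is added back to the output of two 3x3 convolutions (the "skip connection"), and batch normalization follows every convolution. The 1x1 projection on the shortcut is the standard way to handle shape changes; exact details vary between ResNet variants.

```python
import torch
import torch.nn as nn

class BasicResidualBlock(nn.Module):
    """ResNet-style basic block: two 3x3 convs with batch norm, plus a skip connection."""
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        # If the spatial size or channel count changes, project the identity with a 1x1 conv.
        self.shortcut = nn.Identity()
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                nn.BatchNorm2d(out_ch),
            )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + self.shortcut(x)   # the skip connection
        return self.relu(out)

# Example: a downsampling block that halves the resolution and doubles the channels
block = BasicResidualBlock(64, 128, stride=2)
out = block(torch.randn(1, 64, 56, 56))  # -> (1, 128, 28, 28)
```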

In summary: AlexNet has two parallel CNN pipelines trained on two GPUs with cross-connections, GoogLeNet has Inception modules, and ResNet has residual connections.

 

 
