A Paddle Implementation of the VGG Convolutional Neural Network

This code defines a VGG network in PaddlePaddle, consisting of convolutional, pooling, and fully connected layers. The network is built by stacking vgg_block units; each block contains several 3×3 convolutions with ReLU activations and ends with a max-pooling layer. A stack of linear layers performs the final classification, and the total parameter count is 134,264,641.
import paddle
import paddle.nn as nn

# Define the network structure
def vgg_block(num_convs, in_channels, out_channels):
    # First conv maps in_channels -> out_channels; 3x3 kernels with padding 1
    # preserve the spatial size.
    net = [nn.Conv2D(in_channels=in_channels, out_channels=out_channels,
                     kernel_size=3, padding=1),
           nn.ReLU()]
    # The remaining convs keep the channel count at out_channels.
    for _ in range(num_convs - 1):
        net.append(nn.Conv2D(in_channels=out_channels, out_channels=out_channels,
                             kernel_size=3, stride=1, padding=1))
        net.append(nn.ReLU())
    # Each block ends with 2x2 max pooling, halving height and width.
    net.append(nn.MaxPool2D(kernel_size=2))
    return nn.Sequential(*net)
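
As a quick sanity check (a minimal sketch, not part of the original listing), a single block should raise the channel count to out_channels and halve the spatial size:

# Hypothetical usage check: one block, 64 -> 128 channels, 112x112 -> 56x56.
block = vgg_block(num_convs=2, in_channels=64, out_channels=128)
x = paddle.randn([1, 64, 112, 112])
print(block(x).shape)  # [1, 128, 56, 56]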

def vgg_stack(num_convs, channels):
    # num_convs: number of convolutions per block;
    # channels: (in_channels, out_channels) pair per block.
    net = []
    for n, c in zip(num_convs, channels):
        in_c, out_c = c
        net.append(vgg_block(n, in_c, out_c))
    return nn.Sequential(*net)
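
With the VGG16-style configuration used below ([2, 2, 3, 3, 3] convolutions over five (in, out) channel pairs), the stack pools five times, so a 224×224 input is reduced to a 512×7×7 feature map. A minimal check, assuming that same configuration:

# Hypothetical check of the full conv stack: five poolings, 224 / 2^5 = 7.
stack = vgg_stack([2, 2, 3, 3, 3],
                  [[3, 64], [64, 128], [128, 256], [256, 512], [512, 512]])
x = paddle.randn([1, 3, 224, 224])
print(stack(x).shape)  # [1, 512, 7, 7]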

class VGG(paddle.nn.Layer):
    def __init__(self, vgg_net) -> None:
        super(VGG, self).__init__()
        # vgg_net = [convs per block, (in, out) channel pairs per block]
        self.conv = vgg_stack(vgg_net[0], vgg_net[1])
        # Classifier head: with a 224x224 input, the conv stack leaves a
        # 512x7x7 feature map. Note the final Linear outputs a single unit
        # here, not the 1000 classes of the original VGG.
        self.line = nn.Sequential(
            nn.Linear(512 * 7 * 7, 4096),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(4096, 4096),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(4096, 1)
        )

    def forward(self, x):
        x = self.conv(x)
        # Flatten all dimensions except the batch dimension.
        x = paddle.flatten(x, 1, -1)
        x = self.line(x)
        return x
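
Before printing the layer summary, a forward pass on random input confirms the model runs end to end (a sketch; the shapes agree with the summary below):

# Hypothetical end-to-end check: a batch of 2 images yields 2 scalar outputs.
model = VGG([[2, 2, 3, 3, 3], [[3, 64], [64, 128], [128, 256], [256, 512], [512, 512]]])
out = model(paddle.randn([2, 3, 224, 224]))
print(out.shape)  # [2, 1]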
paddle.summary(net=VGG([[2, 2, 3, 3, 3], [[3, 64], [64, 128], [128, 256], [256, 512], [512, 512]]]),
               input_size=(1, 3, 224, 224))
---------------------------------------------------------------------------
 Layer (type)       Input Shape          Output Shape         Param #    
===========================================================================
   Conv2D-1      [[1, 3, 224, 224]]   [1, 64, 224, 224]        1,792     
    ReLU-1      [[1, 64, 224, 224]]   [1, 64, 224, 224]          0       
   Conv2D-2     [[1, 64, 224, 224]]   [1, 64, 224, 224]       36,928     
    ReLU-2      [[1, 64, 224, 224]]   [1, 64, 224, 224]          0       
  MaxPool2D-1   [[1, 64, 224, 224]]   [1, 64, 112, 112]          0       
   Conv2D-3     [[1, 64, 112, 112]]   [1, 128, 112, 112]      73,856     
    ReLU-3      [[1, 128, 112, 112]]  [1, 128, 112, 112]         0       
   Conv2D-4     [[1, 128, 112, 112]]  [1, 128, 112, 112]      147,584    
    ReLU-4      [[1, 128, 112, 112]]  [1, 128, 112, 112]         0       
  MaxPool2D-2   [[1, 128, 112, 112]]   [1, 128, 56, 56]          0       
   Conv2D-5      [[1, 128, 56, 56]]    [1, 256, 56, 56]       295,168    
    ReLU-5       [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
   Conv2D-6      [[1, 256, 56, 56]]    [1, 256, 56, 56]       590,080    
    ReLU-6       [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
   Conv2D-7      [[1, 256, 56, 56]]    [1, 256, 56, 56]       590,080    
    ReLU-7       [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
  MaxPool2D-3    [[1, 256, 56, 56]]    [1, 256, 28, 28]          0       
   Conv2D-8      [[1, 256, 28, 28]]    [1, 512, 28, 28]      1,180,160   
    ReLU-8       [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
   Conv2D-9      [[1, 512, 28, 28]]    [1, 512, 28, 28]      2,359,808   
    ReLU-9       [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
   Conv2D-10     [[1, 512, 28, 28]]    [1, 512, 28, 28]      2,359,808   
    ReLU-10      [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
  MaxPool2D-4    [[1, 512, 28, 28]]    [1, 512, 14, 14]          0       
   Conv2D-11     [[1, 512, 14, 14]]    [1, 512, 14, 14]      2,359,808   
    ReLU-11      [[1, 512, 14, 14]]    [1, 512, 14, 14]          0       
   Conv2D-12     [[1, 512, 14, 14]]    [1, 512, 14, 14]      2,359,808   
    ReLU-12      [[1, 512, 14, 14]]    [1, 512, 14, 14]          0       
   Conv2D-13     [[1, 512, 14, 14]]    [1, 512, 14, 14]      2,359,808   
    ReLU-13      [[1, 512, 14, 14]]    [1, 512, 14, 14]          0       
  MaxPool2D-5    [[1, 512, 14, 14]]     [1, 512, 7, 7]           0       
   Linear-1         [[1, 25088]]          [1, 4096]         102,764,544  
    ReLU-14         [[1, 4096]]           [1, 4096]              0       
   Dropout-1        [[1, 4096]]           [1, 4096]              0       
   Linear-2         [[1, 4096]]           [1, 4096]         16,781,312   
    ReLU-15         [[1, 4096]]           [1, 4096]              0       
   Dropout-2        [[1, 4096]]           [1, 4096]              0       
   Linear-3         [[1, 4096]]             [1, 1]             4,097     
===========================================================================
Total params: 134,264,641
Trainable params: 134,264,641
Non-trainable params: 0
---------------------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 218.58
Params size (MB): 512.18
Estimated Total Size (MB): 731.34
---------------------------------------------------------------------------
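
The per-layer counts in the table follow the standard formulas: a 3×3 convolution has 3*3*in*out + out parameters (Conv2D-1: 3*3*3*64 + 64 = 1,792) and a linear layer has in*out + out (Linear-1: 25088*4096 + 4096 = 102,764,544). A short script re-deriving the total, mirroring the thirteen convolutions and three linear layers above:

# Recompute the parameter total from the layer formulas.
def conv_params(c_in, c_out):      # 3x3 kernel weights + bias
    return 3 * 3 * c_in * c_out + c_out

def linear_params(n_in, n_out):    # weight matrix + bias
    return n_in * n_out + n_out

convs = [(3, 64), (64, 64), (64, 128), (128, 128), (128, 256), (256, 256),
         (256, 256), (256, 512), (512, 512), (512, 512), (512, 512),
         (512, 512), (512, 512)]
total = sum(conv_params(i, o) for i, o in convs)
total += linear_params(25088, 4096) + linear_params(4096, 4096) + linear_params(4096, 1)
print(total)  # 134264641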