Accessing intermediate layers in nn.Sequential()

I.

Construct all the layers in order with nn.Sequential and call the container directly in the forward function.

A key property of nn.Sequential: the container is treated as a single module, i.e., one module can contain many layers.

There are three common ways to define a model with nn.Sequential:

① Basic approach: define each layer in order. The layers have no names, so each layer can only be accessed by its index (see the indexing sketch after the example).
 

import torch
import torch.nn as nn
 
class Net(nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        # Anonymous layers: inside the container they are simply numbered 0, 1, 2, ...
        self.net_1 = nn.Sequential(
            nn.Linear(n_feature, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, n_output)
        )

    def forward(self, x):
        x = self.net_1(x)
        return x

model_2 = Net(1, 10, 1)
print(model_2)
 
 
'''Output:
Net(
  (net_1): Sequential(
    (0): Linear(in_features=1, out_features=10, bias=True)
    (1): ReLU()
    (2): Linear(in_features=10, out_features=1, bias=True)
  )
)
'''
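Since the layers are anonymous, indexing is the only way to reach one. A minimal sketch, reusing the model_2 defined above (the printed shapes are illustrative):

import torch

# Index into the inner Sequential to grab a single layer.
first_linear = model_2.net_1[0]      # Linear(in_features=1, out_features=10)
print(first_linear)
print(first_linear.weight.shape)     # torch.Size([10, 1])

# Slicing a Sequential returns a new Sequential, which is handy
# for extracting intermediate activations:
x = torch.randn(4, 1)
hidden = model_2.net_1[:2](x)        # output right after the ReLU
print(hidden.shape)                  # torch.Size([4, 10])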

② Give each layer a custom name (via an OrderedDict)

import torch.nn as nn
from collections import OrderedDict
 
 
model = nn.Sequential(OrderedDict([
    ('conv1', nn.Conv2d(1, 20, 5)),
    ('relu1', nn.ReLU()),
    ('conv2', nn.Conv2d(20, 64, 5)),
    ('relu2', nn.ReLU())
]))

print(model)
print(model[2])      # a layer can still be fetched by index
print(model.conv1)   # ... or by its custom name
 
'''Output:
Sequential(
  (conv1): Conv2d(1, 20, kernel_size=(5, 5), stride=(1, 1))
  (relu1): ReLU()
  (conv2): Conv2d(20, 64, kernel_size=(5, 5), stride=(1, 1))
  (relu2): ReLU()
)
Conv2d(20, 64, kernel_size=(5, 5), stride=(1, 1))
Conv2d(1, 20, kernel_size=(5, 5), stride=(1, 1))
'''
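With named layers you can also iterate over (name, module) pairs, which makes it easy to stop at a particular layer. A minimal sketch, assuming the model defined above and an input size that fits the conv layers:

import torch

x = torch.randn(1, 1, 32, 32)
for name, layer in model.named_children():
    x = layer(x)
    if name == 'relu1':          # stop right after the layer we care about
        print(name, x.shape)     # intermediate activation after relu1
        break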

③ Add layers to the Sequential one at a time with add_module. This method is inherited from nn.Module (nn.Sequential itself does not define it), and each layer can then be accessed by the custom name you gave it.

import torch.nn as nn

model = nn.Sequential()
model.add_module('conv1', nn.Conv2d(1, 20, 5))
model.add_module('relu1', nn.ReLU())
model.add_module('conv2', nn.Conv2d(20, 64, 5))
model.add_module('relu2', nn.ReLU())

print(model)
print(model[2])      # fetch a layer by index
print(model.conv1)   # fetch a layer by its custom name

'''Output:
Sequential(
  (conv1): Conv2d(1, 20, kernel_size=(5, 5), stride=(1, 1))
  (relu1): ReLU()
  (conv2): Conv2d(20, 64, kernel_size=(5, 5), stride=(1, 1))
  (relu2): ReLU()
)
Conv2d(20, 64, kernel_size=(5, 5), stride=(1, 1))
Conv2d(1, 20, kernel_size=(5, 5), stride=(1, 1))
'''
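Beyond indexing and attribute access, a forward hook is another common way to capture a layer's output without modifying forward. A minimal sketch, assuming the add_module model above:

import torch

activations = {}

def save_output(name):
    # Returns a hook that stores the layer's output under the given name.
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

model.conv1.register_forward_hook(save_output('conv1'))

x = torch.randn(1, 1, 32, 32)
model(x)
print(activations['conv1'].shape)   # torch.Size([1, 20, 28, 28])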

II.

When nn.Sequential() containers are nested two levels deep, a specific layer can be reached with a double index, as in the following excerpt (a MobileNet-style backbone).

import torch.nn as nn

class MobileNet(nn.Module):
    def __init__(self):
        super(MobileNet, self).__init__()

        def conv_bn(inp, oup, stride):
            # Standard convolution block: Conv -> BN -> ReLU
            return nn.Sequential(
                nn.Conv2d(inp, oup, 3, stride, 1, bias=False),
                nn.BatchNorm2d(oup),
                nn.ReLU(inplace=True)
            )

        def conv_dw(inp, oup, stride):
            # Depthwise separable block: depthwise Conv -> BN -> ReLU,
            # followed by pointwise Conv -> BN -> ReLU
            return nn.Sequential(
                nn.Conv2d(inp, inp, 3, stride, 1, groups=inp, bias=False),
                nn.BatchNorm2d(inp),
                nn.ReLU(inplace=True),

                nn.Conv2d(inp, oup, 1, 1, 0, bias=False),
                nn.BatchNorm2d(oup),
                nn.ReLU(inplace=True),
            )

        self.model = nn.Sequential(
            conv_bn(3, 32, 2),
            conv_dw(32, 64, 1),
            conv_dw(64, 128, 2),
            conv_dw(128, 128, 1),
            conv_dw(128, 256, 2),
            conv_dw(256, 256, 1),
            conv_dw(256, 512, 2),
            conv_dw(512, 512, 1),
            conv_dw(512, 512, 1),
            conv_dw(512, 512, 1),
            conv_dw(512, 512, 1),
            conv_dw(512, 512, 1),
            conv_dw(512, 1024, 2),
            conv_dw(1024, 1024, 1),
            nn.AvgPool2d(7),
        )
        self.fc = nn.Linear(1024, 1000)

    def forward(self, x):
        x = self.model(x)
        x = x.view(-1, 1024)
        x = self.fc(x)
        return x

    def get_bn_before_relu(self):
        # Double index: self.model[3] is the 4th block of the outer
        # nn.Sequential, and [-2] is the second-to-last layer of that
        # inner nn.Sequential (the BatchNorm2d just before the last ReLU).
        bn1 = self.model[3][-2]
        bn2 = self.model[5][-2]
        bn3 = self.model[11][-2]
        bn4 = self.model[13][-2]
        return bn1, bn2, bn3, bn4
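To confirm what the double index returns, one can instantiate the class and print the layers. A minimal sketch, assuming the MobileNet class name used in the excerpt above:

net = MobileNet()
print(net.model[3])       # the 4th inner Sequential: a conv_dw(128, 128, 1) block
print(net.model[3][-2])   # its second-to-last layer: BatchNorm2d(128, ...)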
