Using model.named_children()

Overview

model.named_children() is model.children() with names attached: like model.children(), it iterates over the model's direct child modules, but it additionally yields each child's name:
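
A minimal sketch contrasting the two (the toy model here is my own illustration, not from the original post):

import torch.nn as nn
from collections import OrderedDict

# Hypothetical toy model, just to show the iteration behavior.
model = nn.Sequential(OrderedDict([
    ("conv", nn.Conv2d(3, 16, 3)),
    ("act", nn.ReLU()),
]))

# model.children() yields only the submodules themselves:
for child in model.children():
    print(child)              # Conv2d(...), ReLU()

# model.named_children() yields (name, submodule) pairs:
for name, child in model.named_children():
    print(name, "->", child)  # conv -> Conv2d(...), act -> ReLU()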

Traversal methods

In [14]: model_modules = [x for x in model.modules()]

In [15]: model_named_modules = [x for x in model.named_modules()]

In [16]: model_children = [x for x in model.children()]

In [17]: model_named_children = [x for x in model.named_children()]

In [18]: model_parameters = [x for x in model.parameters()]

In [19]: model_named_parameters = [x for x in model.named_parameters()]
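
Note the scope of each traversal: model.modules() and model.named_modules() walk the entire module tree recursively (including the model itself), while model.children() and model.named_children() stop at the direct children; model.parameters() and model.named_parameters() yield the parameter tensors rather than modules. A small sketch (the nested toy model is my own illustration):

import torch.nn as nn

# Hypothetical nested model to contrast modules() and children().
model = nn.Sequential(nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU()))

print(len(list(model.modules())))   # 4: the model itself, the inner Sequential, Conv2d, ReLU
print(len(list(model.children())))  # 1: only the inner Sequential
for name, p in model.named_parameters():
    print(name, p.shape)            # 0.0.weight torch.Size([8, 3, 3, 3]), 0.0.bias torch.Size([8])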

Example

self.model = resnet.resnet50(downsample_factor, pretrained=pretrained)
print("model:", self.model)
model_children = [x for x in self.model.children()]  # count the direct children
print("model.children():", len(model_children))
# print the name of each child
for x in self.model.named_children():
    model_named_children = x
    print(model_named_children)
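
The custom resnet module used here is not shown; as a point of reference, the stock torchvision resnet50 behaves the same way and has 10 direct children:

from torchvision.models import resnet50  # stand-in for the post's custom resnet module

model = resnet50(weights=None)  # pretrained=False on older torchvision versions
print(len(list(model.children())))  # 10
for name, _child in model.named_children():
    print(name)  # conv1, bn1, relu, maxpool, layer1, layer2, layer3, layer4, avgpool, fc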

Complete example

import torch.nn as nn
import torch
import resnet
# https://blog.csdn.net/anshiquanshu/article/details/115199773

class ResNet50(nn.Module):
    def __init__(self, pretrained=False, downsample_factor=8):
        """Declare all needed layers."""
        super(ResNet50, self).__init__()
        self.model = resnet.resnet50(downsample_factor, pretrained=pretrained)
        # print("model:",self.model)
        # model_children = [x for x in self.model.children()]  # count the direct children
        # print("model.children():", len(model_children))
        # # print the name of each child
        # for x in self.model.named_children():
        #     model_named_children = x
        #     print(model_named_children)

        self.relu = self.model.relu  # Place a hook

        layers_cfg = [4, 5, 6, 7]  # indices of layer1..layer4 among model.children()
        self.blocks = []
        # model.children() only traverses the model's direct child modules
        for i, num_this_layer in enumerate(layers_cfg):
            self.blocks.append(list(self.model.children())[num_this_layer])

    def base_forward(self, x):
        feature_map = []
        x = self.model.conv1(x)
        print("x1:", x.shape)

        x = self.model.bn1(x)
        x = self.model.relu(x)
        x = self.model.maxpool(x)
        print("x2:",x.shape)

        for i, block in enumerate(self.blocks):
            x = block(x)
            feature_map.append(x)

        out = nn.AvgPool2d(x.shape[2:])(x).view(x.shape[0], -1)  # global-average-pool the final residual feature map
        return feature_map, out

if __name__ == '__main__':

    x = torch.rand(1, 3, 256, 256)
    rs = ResNet50(pretrained=False, downsample_factor=16)  # overall downsampling factor

    feature_map, out = rs.base_forward(x)
    for x in feature_map:
        print(x.shape)
    print("output:",out.shape)

Reference: https://blog.csdn.net/anshiquanshu/article/details/115199773
