Usage of torchsummary

The `summary` function prints the network architecture layer by layer, together with each layer's output shape and parameter count. Install the package with `pip install torchsummary`.

from torchsummary import summary
from torchvision.models import resnet18

model = resnet18()
# input_size is the per-sample shape (C, H, W); batch_size sets the
# batch dimension shown in the output table. device must match where
# the model's parameters live ("cpu" here).
summary(model, input_size=[(3, 256, 256)], batch_size=2, device="cpu")

Output:

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1          [2, 64, 128, 128]           9,408
       BatchNorm2d-2          [2, 64, 128, 128]             128
              ReLU-3          [2, 64, 128, 128]               0
         MaxPool2d-4            [2, 64, 64, 64]               0
            Conv2d-5            [2, 64, 64, 64]          36,864
       BatchNorm2d-6            [2, 64, 64, 64]             128
              ReLU-7            [2, 64, 64, 64]               0
            Conv2d-8            [2, 64, 64, 64]          36,864
       BatchNorm2d-9            [2, 64, 64, 64]             128
             ReLU-10            [2, 64, 64, 64]               0
       BasicBlock-11            [2, 64, 64, 64]               0
           Conv2d-12            [2, 64, 64, 64]          36,864
      BatchNorm2d-13            [2, 64, 64, 64]             128
             ReLU-14            [2, 64, 64, 64]               0
           Conv2d-15            [2, 64, 64, 64]          36,864
      BatchNorm2d-16            [2, 64, 64, 64]             128
             ReLU-17            [2, 64, 64, 64]               0
       BasicBlock-18            [2, 64, 64, 64]               0
           Conv2d-19           [2, 128, 32, 32]          73,728
      BatchNorm2d-20           [2, 128, 32, 32]             256
             ReLU-21           [2, 128, 32, 32]               0
           Conv2d-22           [2, 128, 32, 32]         147,456
      BatchNorm2d-23           [2, 128, 32, 32]             256
           Conv2d-24           [2, 128, 32, 32]           8,192
      BatchNorm2d-25           [2, 128, 32, 32]             256
             ReLU-26           [2, 128, 32, 32]               0
       BasicBlock-27           [2, 128, 32, 32]               0
           Conv2d-28           [2, 128, 32, 32]         147,456
      BatchNorm2d-29           [2, 128, 32, 32]             256
             ReLU-30           [2, 128, 32, 32]               0
           Conv2d-31           [2, 128, 32, 32]         147,456
      BatchNorm2d-32           [2, 128, 32, 32]             256
             ReLU-33           [2, 128, 32, 32]               0
       BasicBlock-34           [2, 128, 32, 32]               0
           Conv2d-35           [2, 256, 16, 16]         294,912
      BatchNorm2d-36           [2, 256, 16, 16]             512
             ReLU-37           [2, 256, 16, 16]               0
           Conv2d-38           [2, 256, 16, 16]         589,824
      BatchNorm2d-39           [2, 256, 16, 16]             512
           Conv2d-40           [2, 256, 16, 16]          32,768
      BatchNorm2d-41           [2, 256, 16, 16]             512
             ReLU-42           [2, 256, 16, 16]               0
       BasicBlock-43           [2, 256, 16, 16]               0
           Conv2d-44           [2, 256, 16, 16]         589,824
      BatchNorm2d-45           [2, 256, 16, 16]             512
             ReLU-46           [2, 256, 16, 16]               0
           Conv2d-47           [2, 256, 16, 16]         589,824
      BatchNorm2d-48           [2, 256, 16, 16]             512
             ReLU-49           [2, 256, 16, 16]               0
       BasicBlock-50           [2, 256, 16, 16]               0
           Conv2d-51             [2, 512, 8, 8]       1,179,648
      BatchNorm2d-52             [2, 512, 8, 8]           1,024
             ReLU-53             [2, 512, 8, 8]               0
           Conv2d-54             [2, 512, 8, 8]       2,359,296
      BatchNorm2d-55             [2, 512, 8, 8]           1,024
           Conv2d-56             [2, 512, 8, 8]         131,072
      BatchNorm2d-57             [2, 512, 8, 8]           1,024
             ReLU-58             [2, 512, 8, 8]               0
       BasicBlock-59             [2, 512, 8, 8]               0
           Conv2d-60             [2, 512, 8, 8]       2,359,296
      BatchNorm2d-61             [2, 512, 8, 8]           1,024
             ReLU-62             [2, 512, 8, 8]               0
           Conv2d-63             [2, 512, 8, 8]       2,359,296
      BatchNorm2d-64             [2, 512, 8, 8]           1,024
             ReLU-65             [2, 512, 8, 8]               0
       BasicBlock-66             [2, 512, 8, 8]               0
AdaptiveAvgPool2d-67             [2, 512, 1, 1]               0
           Linear-68                  [2, 1000]         513,000
================================================================
Total params: 11,689,512
Trainable params: 11,689,512
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 1.50
Forward/backward pass size (MB): 164.02
Params size (MB): 44.59
Estimated Total Size (MB): 210.12
----------------------------------------------------------------
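The size estimates in the footer are straightforward to reproduce by hand: torchsummary assumes float32 tensors (4 bytes per element) and converts byte counts to MB by dividing by 1024². The sketch below reproduces the "Input size" and "Params size" lines from the table above; the variable names are our own, not part of the torchsummary API.

```python
# Reproduce torchsummary's size estimates, assuming float32 (4 bytes/element).
BYTES_PER_FLOAT32 = 4
MB = 1024 ** 2

batch_size = 2
input_shape = (3, 256, 256)          # (C, H, W) passed to summary()
total_params = 11_689_512            # "Total params" from the table

input_elems = batch_size * input_shape[0] * input_shape[1] * input_shape[2]
input_mb = input_elems * BYTES_PER_FLOAT32 / MB
params_mb = total_params * BYTES_PER_FLOAT32 / MB

print(f"Input size (MB): {input_mb:.2f}")    # 1.50, matching the table
print(f"Params size (MB): {params_mb:.2f}")  # 44.59, matching the table
```

The "Forward/backward pass size" is computed the same way from the sum of all layer output shapes in the table (doubled to account for stored gradients), and the estimated total is the sum of all three.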