A verified, runnable ResNet18 implementation (PyTorch)

There are many ResNet18 implementations online, differing in layout and in small details. Beginners easily get lost among them, and after wasting a lot of time the code may still not run.

The code below was itself pieced together from various online sources, and it had quite a few bugs when I first ran it. After my fixes it is confirmed to work and can be used directly:

import torch
from torch import nn
from torch.nn import functional as F

# Basic residual block used throughout ResNet18
class ResBlock(nn.Module):
    def __init__(self, inchannel, outchannel, stride=1):
        super(ResBlock, self).__init__()
        # Layers on the main (residual) path
        self.conv1 = nn.Conv2d(inchannel, outchannel, kernel_size=3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(outchannel)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(outchannel, outchannel, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(outchannel)
 
        self.shortcut = nn.Sequential()  # identity shortcut: does nothing when no downsampling is needed

        # When the spatial size or channel count changes, project the input with
        # a 1x1 convolution so it can be added to the main path.
        if stride != 1 or inchannel != outchannel:
            self.shortcut = nn.Sequential(
                nn.Conv2d(inchannel, outchannel, kernel_size=1, stride=stride, bias=False),
                nn.BatchNorm2d(outchannel))


    def forward(self, x):
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out)
        out = out + self.shortcut(x)  # residual connection
        out = self.relu(out)
        return out

class ResNet18(nn.Module):
    def __init__(self, block, num_classes=10):
        super(ResNet18, self).__init__()
        self.inchannel = 64
        # ImageNet-style stem: 7x7 conv with stride 2, then (optionally) max pooling
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1, ceil_mode=False)
        # Four stages of two residual blocks each; stages 2-4 halve the resolution
        self.layer1 = self._make_layer(block, 64, 2, stride=1)
        self.layer2 = self._make_layer(block, 128, 2, stride=2)
        self.layer3 = self._make_layer(block, 256, 2, stride=2)
        self.layer4 = self._make_layer(block, 512, 2, stride=2)
        self.avgpool = nn.AdaptiveAvgPool2d(output_size=(1, 1))
        self.fc = nn.Linear(512, num_classes)

    def forward(self, x):
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        # out = self.maxpool(out)  # disabled for small CIFAR-10 inputs; see note below
        out = self.layer1(out)
        out = self.layer2(out)
        out = self.layer3(out)
        out = self.layer4(out)
        out = self.avgpool(out)
        out = out.view(out.size(0), -1)  # flatten to (batch, 512)
        out = self.fc(out)
        return out

    def _make_layer(self, block, channels, num_blocks, stride):
        # The first block of a stage may downsample (stride > 1); the rest use stride 1
        strides = [stride] + [1] * (num_blocks - 1)
        layers = []
        for stride in strides:
            layers.append(block(self.inchannel, channels, stride))
            self.inchannel = channels
        return nn.Sequential(*layers)


if __name__ == '__main__':
    model = ResNet18(ResBlock)
    print(model)

Note: I tested this code on CIFAR-10, whose images are only 32x32 pixels, which is very small. For that reason, I commented out the line out = self.maxpool(out).

If you use this code on larger images, it is best to add that line back; it works better.
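As a quick sanity check of the output shape (the batch size and input resolution here are arbitrary):

import torch

model = ResNet18(ResBlock)
x = torch.randn(2, 3, 32, 32)  # dummy CIFAR-10-sized batch
y = model(x)
print(y.shape)                 # expected: torch.Size([2, 10])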

For reference, I trained this network on CIFAR-10 for 50 epochs with the Adam optimizer and lr=0.001.
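A minimal training-loop sketch matching that setup, using the classes defined above (torchvision is assumed to be installed; the batch size and normalization constants are my own choices, not part of the original setup):

import torch
from torch import nn
from torchvision import datasets, transforms

device = 'cuda' if torch.cuda.is_available() else 'cpu'

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])
train_set = datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

model = ResNet18(ResBlock).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for epoch in range(50):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch + 1}: last batch loss {loss.item():.4f}')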
