PyTorch error when counting model parameters: size mismatch for module.conv1.weight: copying a param with shape torch.Size([16, 3, 3, 3])

Error:

RuntimeError: Error(s) in loading state_dict for DataParallel:
	size mismatch for module.conv1.weight: copying a param with shape torch.Size([16, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([128, 128, 3, 3]).
	size mismatch for module.bn1.weight: copying a param with shape torch.Size([16]) from checkpoint, the shape in current model is torch.Size([128]).
	size mismatch for module.bn1.bias: copying a param with shape torch.Size([16]) from checkpoint, the shape in current model is torch.Size([128]).
	size mismatch for module.bn1.running_mean: copying a param with shape torch.Size([16]) from checkpoint, the shape in current model is torch.Size([128]).
	size mismatch for module.bn1.running_var: copying a param with shape torch.Size([16]) from checkpoint, the shape in current model is torch.Size([128]).
	size mismatch for module.bn2.weight: copying a param with shape torch.Size([64]) from checkpoint, the shape in current model is torch.Size([256]).
	size mismatch for module.bn2.bias: copying a param with shape torch.Size([64]) from checkpoint, the shape in current model is torch.Size([256]).
	size mismatch for module.bn2.running_mean: copying a param with shape torch.Size([64]) from checkpoint, the shape in current model is torch.Size([256]).
	size mismatch for module.bn2.running_var: copying a param with shape torch.Size([64]) from checkpoint, the shape in current model is torch.Size([256]).

Solution:

Add the following lines when loading the model, because the checkpoint was saved from a model trained on multiple GPUs (wrapped in nn.DataParallel), so its keys carry the "module." prefix:

import torch
import torch.nn as nn

model = nn.DataParallel(model).cuda()  # wrap the model so its keys match the "module." prefix in the multi-GPU checkpoint
model.load_state_dict(torch.load("C:\\Users\\83543\\Desktop\\model_best.pth.tar")['state_dict'], strict=False)
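If the size mismatch persists, it can help to compare the parameter shapes stored in the checkpoint with the shapes of the freshly built model before calling load_state_dict. The sketch below is a minimal diagnostic, assuming a hypothetical build_model() that constructs the network with the same configuration used during training:

import torch
import torch.nn as nn

checkpoint = torch.load("C:\\Users\\83543\\Desktop\\model_best.pth.tar", map_location="cpu")
saved_state = checkpoint['state_dict']

model = nn.DataParallel(build_model())  # build_model() is hypothetical; DataParallel adds the "module." prefix
current_state = model.state_dict()

# Print every parameter whose shape in the checkpoint differs from the current model
for name, saved_tensor in saved_state.items():
    if name in current_state and current_state[name].shape != saved_tensor.shape:
        print(name, "checkpoint:", tuple(saved_tensor.shape), "model:", tuple(current_state[name].shape))

Any layer printed here, such as module.conv1.weight with [16, 3, 3, 3] versus [128, 128, 3, 3] in the error above, indicates that the current model was constructed with different layer widths than the one that produced the checkpoint.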