If the pretrained model you are loading was saved while wrapped in torch.nn.DataParallel(), but the current training run does not use it, you will get this error. There are two ways to fix it:

1. Wrap the current model in torch.nn.DataParallel() as well.
2. Build a new state dict without the `module.` prefix, i.e. strip `module.` from every key of the original dict.
Solution 1:

```python
model = torch.nn.DataParallel(model)
cudnn.benchmark = True
```
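A minimal, CPU-safe sketch of why Solution 1 works: a checkpoint saved from a DataParallel-wrapped model carries the `module.` prefix on every key, so wrapping the current model makes the keys line up again. The tiny nn.Linear model here is purely illustrative.

```python
import torch
import torch.nn as nn

# A tiny stand-in model; the layer and its shape are illustrative.
net = nn.Linear(4, 2)

# Simulate a checkpoint saved from a DataParallel-wrapped model:
# every key carries the `module.` prefix.
wrapped = nn.DataParallel(nn.Linear(4, 2))
checkpoint = wrapped.state_dict()
print(list(checkpoint.keys()))  # ['module.weight', 'module.bias']

# Loading into the bare model would fail because the keys do not match:
#   net.load_state_dict(checkpoint)  # -> unexpected key "module.weight"

# Wrapping the current model in DataParallel makes the keys match.
model = nn.DataParallel(net)
model.load_state_dict(checkpoint)
```

Note that nn.DataParallel does not change the weights, only the key names: the underlying module is still reachable as `model.module`.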
Solution 2:

```python
# original saved file with DataParallel
state_dict = torch.load('myfile.pth')
# create new OrderedDict that does not contain `module.`
from collections import OrderedDict
new_state_dict = OrderedDict()
for k, v in state_dict.items():
    name = k[7:]  # remove the `module.` prefix
    new_state_dict[name] = v
# load params
model.load_state_dict(new_state_dict)
```
Solution 3 (a one-line variant of Solution 2):

```python
model.load_state_dict({k.replace('module.', ''): v
                       for k, v in torch.load('myfile.pth').items()})
```
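Both variants assume every key starts with the prefix: `k[7:]` silently corrupts keys that lack it, and `k.replace('module.', '')` also rewrites any later occurrence of the substring (e.g. a submodule that happens to be named `module`). A slightly more defensive sketch (the helper name is my own) strips only a leading prefix:

```python
from collections import OrderedDict

def strip_prefix(state_dict, prefix='module.'):
    """Return a copy of state_dict with a leading `prefix` removed from each key."""
    out = OrderedDict()
    for k, v in state_dict.items():
        out[k[len(prefix):] if k.startswith(prefix) else k] = v
    return out

# Keys without the leading prefix pass through unchanged,
# and an inner 'module.' substring is preserved.
sd = OrderedDict([('module.fc.weight', 1), ('fc.module.bias', 2)])
print(list(strip_prefix(sd)))  # ['fc.weight', 'fc.module.bias']
```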
You probably saved the model using nn.DataParallel, which stores the model in `module`, and now you are trying to load it without DataParallel. You can either add nn.DataParallel temporarily in your network for loading purposes, or you can load the weights file, create a new ordered dict without the `module.` prefix, and load it back.
Reference: https://discuss.pytorch.org/t/solved-keyerror-unexpected-key-module-encoder-embedding-weight-in-state-dict/1686/3