DataParallel Notes

Note: this no longer seems to be supported as of torch 1.6.
Saving the model:
import torch

if __name__ == '__main__':
    model = torch.nn.Linear(10, 2)  # placeholder for the trained model
    model_path = "best.pth"
    # A wrapped model keeps its real parameters under .module; save that
    # state_dict so the checkpoint can be loaded without the wrapper.
    if isinstance(model, (torch.nn.DataParallel, torch.nn.parallel.DistributedDataParallel)):
        torch.save(model.module.state_dict(), model_path)
    else:
        torch.save(model.state_dict(), model_path)
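A complementary sketch (not from the original post): since the saved state_dict comes from the bare module, loading it means rebuilding the unwrapped model first, loading the weights, and only then re-wrapping. The Linear(10, 2) model here is a hypothetical stand-in matching the placeholder above.

import torch

# Hypothetical stand-in: rebuild the same architecture that was saved, unwrapped.
model = torch.nn.Linear(10, 2)
state_dict = torch.load("best.pth", map_location="cpu")
model.load_state_dict(state_dict)

# Optionally re-wrap for multi-GPU use after the weights are loaded.
if torch.cuda.device_count() > 1:
    model = torch.nn.DataParallel(model)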
Other examples: