File "/home/yy/anaconda3/envs/py37/lib/python3.7/site-packages/pytorch_pretrained_bert/modeling.py", line 727, in forward
extended_attention_mask = extended_attention_mask.to(dtype=next(self.parameters()).dtype) # fp16 compatibility
Issue with multi-GPU training under torch 1.6:
Fix: replace next(self.parameters()).dtype with torch.float32
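A minimal sketch of the patched line, assuming a float32 model. Under nn.DataParallel in newer PyTorch versions (1.5+), the replicated module can expose no parameters inside forward, so next(self.parameters()) can raise StopIteration; hard-coding the dtype avoids touching the parameter iterator (at the cost of losing automatic fp16 handling). The tensor shapes here are illustrative, not from the original code:

```python
import torch

# Illustrative inputs: a (batch, seq_len) attention mask, as BERT expects.
attention_mask = torch.ones(2, 8)
extended_attention_mask = attention_mask[:, None, None, :]  # (batch, 1, 1, seq_len)

# Original line (can raise StopIteration on DataParallel replicas):
# extended_attention_mask = extended_attention_mask.to(
#     dtype=next(self.parameters()).dtype)  # fp16 compatibility

# Patched line: hard-code float32 instead of querying the parameters.
extended_attention_mask = extended_attention_mask.to(dtype=torch.float32)
extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
```

Note that this patch assumes the model runs in fp32; for fp16 training the dtype would need to be torch.float16 instead.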