【Cause】This comes from PyTorch itself: since PyTorch 1.7, `pack_padded_sequence` requires the `lengths` argument to be a 1-D CPU int64 tensor, even when the input tensor lives on the GPU. See fix pack_padded_sequence() by antoniogois · Pull Request #119 · joeynmt/joeynmt · GitHub.
【Solution】Move the lengths tensor to the CPU before packing:
packed = pack_padded_sequence(embed_src, src_length, batch_first=True)
# change to
packed = pack_padded_sequence(embed_src, src_length.cpu(), batch_first=True)
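A minimal self-contained sketch of the fix (the tensor shapes, names, and random data here are illustrative, not from the original project):

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

device = "cuda" if torch.cuda.is_available() else "cpu"

# Padded batch: 3 sequences, max length 5, embedding dim 8 (illustrative shapes)
embed_src = torch.randn(3, 5, 8, device=device)
# True lengths of each sequence; in a real pipeline this may sit on the GPU
src_length = torch.tensor([5, 3, 2], device=device)

# Since PyTorch 1.7, the lengths must be on the CPU, so move them there
# explicitly; the data tensor itself can stay on the GPU.
packed = pack_padded_sequence(embed_src, src_length.cpu(), batch_first=True)

# packed.data holds only the non-padded timesteps: 5 + 3 + 2 = 10 rows
print(packed.data.shape)
```

Note that `pack_padded_sequence` also expects lengths sorted in descending order unless you pass `enforce_sorted=False`.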