1. pack_padded_sequence ValueError
Solution:
https://discuss.pytorch.org/t/char-lstm-for-sequence-tagging/12802
https://discuss.pytorch.org/t/pack-padded-sequence-valueerror/3261/3
The input data is a LongTensor; before calling pack_padded_sequence, convert it to a FloatTensor and reshape input_var to seq_length X batch X 1. Also, within each batch the sequences must be sorted by length in descending order.
When calling pack_padded_sequence, pass the lengths as length.data.numpy() (note that .data is an attribute, not a method, so length.data().numpy() fails).
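A minimal sketch of the steps above (the toy batch and tensor names are illustrative, not from the original post): sort the padded batch by length in descending order, convert the LongTensor to float, reshape to seq_length X batch X 1, then pack. In recent PyTorch versions the lengths can be passed directly as a CPU tensor or list instead of a NumPy array.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# Hypothetical toy batch: 3 integer sequences padded to length 5.
batch = torch.LongTensor([
    [4, 2, 7, 1, 0],   # true length 4
    [3, 9, 0, 0, 0],   # true length 2
    [5, 6, 8, 2, 1],   # true length 5
])
lengths = torch.tensor([4, 2, 5])

# Sort by length, longest first, as pack_padded_sequence expects.
lengths, order = lengths.sort(descending=True)
batch = batch[order]

# Convert LongTensor -> FloatTensor and reshape to (seq_len, batch, 1).
input_var = batch.float().t().unsqueeze(-1)   # shape: (5, 3, 1)

packed = pack_padded_sequence(input_var, lengths)
print(packed.data.shape)   # (sum of lengths, 1) = (11, 1)
```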
2. LSTM, RuntimeError: bool value of Variable objects containing non-empty torch.LongTensor is ambiguous
As in issue 1, convert input_data from a LongTensor to a FloatTensor beforehand and reshape input_var to seq_length X batch X 1.
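A short sketch of the fix for this error (the exact error message above is from the old Variable-era API; current PyTorch raises a dtype error instead, but the cause and fix are the same). The LSTM sizes here are illustrative assumptions:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=1, hidden_size=8)

# A LongTensor batch (e.g. token ids), padded: shape (batch, seq_len).
batch = torch.LongTensor([[4, 2, 7], [3, 9, 0]])

# Feeding integer data to the LSTM directly triggers the error; convert
# to float and reshape to (seq_len, batch, 1) first.
input_var = batch.float().t().unsqueeze(-1)   # shape: (3, 2, 1)

out, (h, c) = lstm(input_var)
print(out.shape)   # (seq_len, batch, hidden_size) = (3, 2, 8)
```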
3. pad_packed_sequence: unpacked_out_x has a different size for each batch (it is padded only to that batch's longest sequence), so subsequent operations that assume a consistent size fail.
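One standard way to address this (not stated in the original note, but part of the documented pad_packed_sequence API since PyTorch 0.4) is the total_length argument, which pads every batch to the same fixed length. A sketch, with MAX_LEN and the toy tensors as assumptions:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

MAX_LEN = 10  # assumed fixed length that all batches should share

# Toy batch whose longest sequence (6) is shorter than MAX_LEN.
x = torch.randn(6, 3, 1)           # (seq_len, batch, features)
lengths = torch.tensor([6, 4, 2])  # already sorted descending

packed = pack_padded_sequence(x, lengths)

# By default, the unpacked tensor is padded only to this batch's longest
# sequence, so its size varies from batch to batch.
out_default, _ = pad_packed_sequence(packed)
print(out_default.shape)           # (6, 3, 1)

# total_length forces every batch to the same padded length.
out_fixed, _ = pad_packed_sequence(packed, total_length=MAX_LEN)
print(out_fixed.shape)             # (10, 3, 1)
```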