import torch
import torch.nn.functional as F
import numpy as np

a = torch.Tensor([1, 2, 3, 4])
# fill positions where the mask is 1 with -inf
a = a.masked_fill(mask=torch.ByteTensor([1, 1, 0, 0]), value=-np.inf)
print(a)
b = F.softmax(a, dim=-1)  # pass dim explicitly; the implicit dim was deprecated
print(b)

Output:
tensor([-inf, -inf, 3., 4.])
tensor([0.0000, 0.0000, 0.2689, 0.7311])
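This is exactly the pattern used for attention masking: set the disallowed positions to -inf so that softmax assigns them zero probability. A minimal sketch with made-up scores and a hypothetical padding mask:

```python
import torch
import torch.nn.functional as F

# scores for 2 queries over 4 keys (illustrative numbers)
scores = torch.tensor([[1.0, 2.0, 3.0, 4.0],
                       [4.0, 3.0, 2.0, 1.0]])
# True marks positions to hide (e.g. padding keys)
pad_mask = torch.tensor([[True, True, False, False],
                         [False, False, True, True]])
scores = scores.masked_fill(pad_mask, float('-inf'))
attn = F.softmax(scores, dim=-1)  # masked positions get probability 0
print(attn)
```

Each row sums to 1 over the unmasked positions only, so the masked keys contribute nothing to the attention-weighted sum.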
Common error: Expected object of scalar type Byte but got scalar type Long for argument #2 'mask'
Cause: the mask was built with torch.LongTensor().
Fix: build it with torch.ByteTensor() instead.
masked_fill fills the positions where the mask is 1 with value. The mask must have the same number of elements as the tensor, but its shape may differ.
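Note that on recent PyTorch versions the byte mask is itself deprecated; a torch.bool mask avoids the warning. Also, per the current masked_fill documentation the mask's shape only needs to be broadcastable to the tensor's, so a per-column mask can cover a 2-D tensor. A sketch (tensor values are illustrative):

```python
import torch

x = torch.zeros(2, 4)
# bool mask of shape (4,) broadcasts over the first dimension of x
mask = torch.tensor([True, True, False, False])
y = x.masked_fill(mask, float('-inf'))
print(y)  # columns 0 and 1 of every row are now -inf
```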