import torch
import torch.nn.functional as F

# Note: torch.autograd.Variable is deprecated; plain tensors track gradients now.
# Avoid naming a tensor `input`, which shadows the Python builtin.
x = torch.rand(1, 3)
print(x)
print('softmax={}'.format(F.softmax(x, dim=1)))
# torch.log works directly on tensors; no need to round-trip through NumPy.
print('logsoftmax={}'.format(torch.log(F.softmax(x, dim=1))))
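Taking the log of a separately computed softmax works, but PyTorch also provides `F.log_softmax`, which fuses the two steps and is more numerically stable for large inputs. A minimal sketch (the example values are assumptions, not from the original post) comparing the two:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0]])  # hypothetical example input

# Two ways to get log-probabilities:
via_log = torch.log(F.softmax(x, dim=1))  # softmax first, then log
direct = F.log_softmax(x, dim=1)          # fused, numerically stable

print(via_log)
print(direct)
print(torch.allclose(via_log, direct))
```

For the same reason, loss functions such as `nn.NLLLoss` are usually paired with `log_softmax` rather than `log(softmax(...))`.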
Expressing softmax and log_softmax in PyTorch