import torch
from torch.nn import functional as F

x = torch.randn(1, 10)                      # input, shape (1, 10)
w = torch.randn(1, 10, requires_grad=True)  # weights, gradient tracked
o = torch.sigmoid(x @ w.t())                # prediction, shape (1, 1)
loss = F.mse_loss(o, torch.ones(1, 1))      # MSE loss; argument order is (input, target)
loss.backward()                             # populates w.grad
print("x:", x)
print("w:", w)
print("o:", o)
print("loss:", loss)
print("w.grad:", w.grad)
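As a sanity check (not part of the original snippet), the gradient autograd computes can be compared against the chain rule written out by hand. For a scalar output o = sigmoid(x·wᵀ) and target 1, dL/do = 2(o − 1), do/dz = o(1 − o), and dz/dw = x, so w.grad should equal 2(o − 1)·o·(1 − o)·x. A minimal sketch, assuming the same setup as above:

```python
import torch
from torch.nn import functional as F

torch.manual_seed(0)
x = torch.randn(1, 10)
w = torch.randn(1, 10, requires_grad=True)

o = torch.sigmoid(x @ w.t())            # forward pass, shape (1, 1)
loss = F.mse_loss(o, torch.ones(1, 1))  # mean squared error vs. target 1
loss.backward()                         # autograd fills in w.grad

# chain rule by hand: dL/do * do/dz * dz/dw
manual_grad = 2 * (o - 1) * o * (1 - o) * x

print(torch.allclose(w.grad, manual_grad.detach()))  # True
```

The `detach()` is needed only because `manual_grad` was built from `o`, which itself tracks gradients; the comparison should agree to floating-point precision.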
Computing the loss and its gradient