Question: When computing gradients with torch.autograd.grad, the following error is raised: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Take the simple model f(x) = x*w + b as an example (the loss line below is reconstructed as an MSE loss, consistent with the `functional` import):

import torch
from torch.nn import functional as F

x = torch.ones(1)
w = torch.full([1], 2)  # integer tensor; requires_grad is not set
mse = F.mse_loss(torch.ones(1), x * w)
torch.autograd.grad(mse, [w])  # raises the RuntimeError above
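The error occurs because `w` was created without `requires_grad=True`, so the loss has no `grad_fn` and autograd cannot differentiate through it. A minimal sketch of the fix, assuming the same f(x) = x*w setup: create `w` as a floating-point tensor with `requires_grad=True` (PyTorch only allows gradients on floating-point tensors, so the fill value must be `2.0`, not `2`).

```python
import torch
from torch.nn import functional as F

x = torch.ones(1)
# Float tensor that tracks gradients; an int fill value would be rejected
w = torch.full([1], 2.0, requires_grad=True)

mse = F.mse_loss(torch.ones(1), x * w)
grad_w, = torch.autograd.grad(mse, [w])
# d(mse)/dw = 2 * (x*w - 1) * x = 2 * (2 - 1) * 1 = 2
print(grad_w)  # tensor([2.])
```

Alternatively, if `w` already exists as a float tensor, calling `w.requires_grad_()` in place and then recomputing `mse` has the same effect; the key point is that the graph must be built after `w` starts requiring gradients.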