Error message:
Traceback (most recent call last):
File "/users/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in __call__
result = self.forward(*input, **kwargs)
File "/users/anaconda3/lib/python3.6/site-packages/torch/nn/modules/loss.py", line 504, in forward
return F.binary_cross_entropy(input, target, weight=self.weight, reduction=self.reduction)
File "/users4/zsun/anaconda3/lib/python3.6/site-packages/torch/nn/functional.py", line 2027, in binary_cross_entropy
input, target, weight, reduction_enum)
RuntimeError: the derivative for 'target' is not implemented
The offending code:
loss = torch.nn.BCELoss()(preds, targets)
What the error means:
BCELoss implements no derivative with respect to target, so the second argument, targets, must not require a gradient.
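For context, a minimal sketch that reproduces the error (the shapes and values here are hypothetical): when targets comes out of an earlier differentiable computation, it reaches the loss as a non-leaf tensor with requires_grad=True:

import torch

preds = torch.sigmoid(torch.randn(4, requires_grad=True))  # model output in (0, 1)
# targets produced by another differentiable computation, so it carries
# grad history and requires_grad=True into the loss
targets = torch.sigmoid(torch.randn(4, requires_grad=True))

loss = torch.nn.BCELoss()(preds, targets)
# RuntimeError: the derivative for 'target' is not implemented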
Checking the error:
print(preds.requires_grad, targets.requires_grad)
Both come out True.
However, preds.requires_grad=True is correct for the loss input; targets.requires_grad should be False.
Fix:
Add the line targets = targets.detach() before computing the loss.
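Continuing the hypothetical example above, a minimal sketch of the fix:

targets = targets.detach()  # new tensor, same data, cut out of the autograd graph
print(preds.requires_grad, targets.requires_grad)  # True False

loss = torch.nn.BCELoss()(preds, targets)  # forward now succeeds
loss.backward()  # gradients flow only into preds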
NOTE:
Setting targets.requires_grad = False does not work.
The reason is:
you can only change requires_grad flags of leaf variables.
If you want to use a computed variable in a subgraph that doesn't require differentiation please use detach()
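A short sketch of the contrast, again assuming targets is a non-leaf (computed) tensor as in the example above:

# targets was computed from other tensors, so it is not a leaf variable;
# assigning to its requires_grad flag raises the RuntimeError quoted above
try:
    targets.requires_grad = False
except RuntimeError as e:
    print(e)  # you can only change requires_grad flags of leaf variables...

# detach() returns a new tensor that shares the same data but has no grad
# history, so it works whether or not targets is a leaf
targets = targets.detach()
print(targets.requires_grad)  # False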