Problem description
While running deep-learning code in PyTorch, F.cross_entropy raised the following error:
Traceback (most recent call last):
  File "main.py", line 217, in <module>
    acc = test_clean()
  File "main.py", line 201, in test_clean
    loss = criterion(outputs, targets)
  File "/home/kaiyuan/anaconda3/envs/torch/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/kaiyuan/anaconda3/envs/torch/lib/python3.8/site-packages/torch/nn/modules/loss.py", line 961, in forward
    return F.cross_entropy(input, target, weight=self.weight,
  File "/home/kaiyuan/anaconda3/envs/torch/lib/python3.8/site-packages/torch/nn/functional.py", line 2468, in cross_entropy
    return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
  File "/home/kaiyuan/anaconda3/envs/torch/lib/python3.8/site-packages/torch/nn/functional.py", line 1605, in log_softmax
    ret = input.log_softmax(dim)
AttributeError: 'tuple' object has no attribute 'log_softmax'
Cause analysis
The criterion call is normally preceded by a forward pass, and that forward pass usually returns a single tensor, outputs. In some cases, however, the model returns more than one value. For example, Inception-v3 also returns an auxiliary output aux, so the forward pass returns the pair (outputs, aux). If you still catch that return value in a single variable named outputs, the variable actually holds a tuple, and passing it to the criterion raises the error above. So this error is generally caused by the feed-forward step, not by the criterion itself.
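A minimal sketch of the failure mode, using a hypothetical two-output model (TwoHeadModel and its layers are illustrative, standing in for something like Inception-v3 in training mode):

```python
import torch
import torch.nn as nn

# Hypothetical model whose forward() returns TWO values,
# like Inception-v3 in training mode: (outputs, aux)
class TwoHeadModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 3)
        self.aux_fc = nn.Linear(10, 3)

    def forward(self, x):
        return self.fc(x), self.aux_fc(x)

model = TwoHeadModel()
criterion = nn.CrossEntropyLoss()

x = torch.randn(4, 10)
targets = torch.randint(0, 3, (4,))

outputs = model(x)  # BUG: outputs is a tuple, not a Tensor
try:
    loss = criterion(outputs, targets)
except (AttributeError, TypeError) as e:
    # Older torch raises AttributeError ('tuple' object has no
    # attribute 'log_softmax'); newer versions raise TypeError.
    print(type(e).__name__, e)
```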
Solution
Unpack the forward pass into an extra variable so each return value gets its own name.
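The same hypothetical two-output model as above, with the fix applied (the model and layer names are illustrative):

```python
import torch
import torch.nn as nn

# Hypothetical two-output model (illustrative stand-in for Inception-v3)
class TwoHeadModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 3)
        self.aux_fc = nn.Linear(10, 3)

    def forward(self, x):
        return self.fc(x), self.aux_fc(x)

model = TwoHeadModel()
criterion = nn.CrossEntropyLoss()

x = torch.randn(4, 10)
targets = torch.randint(0, 3, (4,))

# FIX: unpack both return values instead of catching them in one variable
outputs, aux = model(x)
loss = criterion(outputs, targets)  # outputs is now a Tensor, so this works
```

Note that torchvision's Inception-v3 only returns the auxiliary output in training mode; in eval mode it returns just the main output, so the unpacking may need to be conditional on model.training.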