Conclusion
Using torch.abs to take an absolute value does not break the computation graph, and backpropagation through it computes gradients correctly.
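The reason is that abs has a well-defined derivative almost everywhere, sign(x) (with a subgradient of 0 at x = 0), so autograd can propagate through it. A minimal check of this, separate from the session below (the variable name `y` is my own):

```python
import torch

# abs contributes sign(y) to the chain rule
y = torch.tensor([-2.0, 0.5, 3.0], requires_grad=True)
torch.abs(y).sum().backward()
assert torch.equal(y.grad, torch.sign(y))
```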
Example
>>> import torch
>>> import torch.nn as nn
>>> inp = torch.randn(1, 1, 4, 4)
>>> conv = nn.Conv2d(1, 1, 3)
>>> out = torch.abs(conv(inp)) # use abs in the forward pass
>>> loss = torch.mean(out)
>>> loss.backward() # backpropagate
>>> conv.weight.grad # the gradient is computed as expected
tensor([[[[ 0.9184, -0.7436, 0.9369],
[ 0.1592, -0.9278, 0.2882],
[ 0.1048, -0.6844, -0.0950]]]])
>>> out = torch.abs(conv(inp)) # run the same forward pass again
>>> loss = torch.mean(out)
>>> loss.backward() # backpropagate a second time
>>> conv.weight.grad # every value is exactly doubled: gradients accumulate in .grad unless it is zeroed
tensor([[[[ 1.8369, -1.4872, 1.8738],
[ 0.3184, -1.8557, 0.5765],
[ 0.2097, -1.3688, -0.1900]]]])