PyTorch error: "Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time."
I hit this for the first time while working on a deep learning project. The cause: the model is built from several sub-networks, and the output of an earlier network is fed into a later one, so the later network's computation graph contains the earlier network's graph and the two share buffers. Calling backward() on a loss from the earlier network frees those shared buffers, and a second backward() through the later network then fails. Solution: pass retain_graph=True to the earlier backward() call, as the error message suggests (or detach() the earlier network's output before feeding it forward, if gradients should not flow back through it).
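A minimal sketch that reproduces the error and the fix. The two sub-networks (`net1`, `net2`) and the losses are placeholders standing in for the project's actual networks; the key point is that `net2`'s output depends on `net1`'s, so both losses backpropagate through `net1`'s graph:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net1 = nn.Linear(4, 4)   # earlier network
net2 = nn.Linear(4, 1)   # later network, consumes net1's output
x = torch.randn(2, 4)

# Reproduce the error: loss1.backward() frees the buffers of net1's graph,
# which loss2 still needs, because loss2's graph runs through h = net1(x).
h = net1(x)
loss1, loss2 = h.sum(), net2(h).sum()
loss1.backward()
try:
    loss2.backward()
except RuntimeError as e:
    print("second backward failed:", e)

# Fix: retain the graph on the first backward call so the shared
# buffers survive for the second one.
net1.zero_grad(); net2.zero_grad()
h = net1(x)
loss1, loss2 = h.sum(), net2(h).sum()
loss1.backward(retain_graph=True)
loss2.backward()   # succeeds now
print("ok")
```

If gradients from the later loss should not reach the earlier network at all, feeding `net2(h.detach())` instead avoids the shared graph entirely, and no `retain_graph` is needed.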