PyTorch: calling backward() with multiple optimizers raises an in-place error

The error:
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
This happens when multiple optimizers are used and backward() is called on multiple losses, with each optimizer stepping immediately after its own backward():
losses['d'].backward() # loss1
trainer_module.optimizer_d.step() # optim1
losses['g'].backward() # loss2
trainer_module.optimizer_g.step() # optim2
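A minimal sketch that reproduces the error with the interleaved ordering above. The two linear modules stand in for a GAN's discriminator and generator (the names d, g, opt_d are illustrative, not from the original code); the second backward() needs d's original weights, which opt_d.step() has already modified in place:

```python
import torch

torch.manual_seed(0)
d = torch.nn.Linear(4, 1)   # stand-in for the discriminator
g = torch.nn.Linear(4, 4)   # stand-in for the generator
opt_d = torch.optim.SGD(d.parameters(), lr=0.1)

x = torch.randn(8, 4)
fake = g(x)
loss_d = d(fake.detach()).mean()   # D's loss; detach() keeps G out of this graph
loss_g = d(fake).mean()            # G's loss goes through D's current weights

raised = False
loss_d.backward()
opt_d.step()                       # modifies d's parameters in place
try:
    loss_g.backward()              # still needs d's pre-step weights
except RuntimeError as e:
    raised = True                  # "... modified by an inplace operation"
    print(type(e).__name__, "raised as expected")
```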
Solution
No❌:
a1.backward() # loss1
a1.optim.step() # optim1
a2.backward() # loss2
a2.optim.step() # optim2
Yes ✔:
a1.backward() # loss1
a2.backward() # loss2
a1.optim.step() # optim1
a2.optim.step() # optim2
Run all backward() calls first, then all step() calls. An optimizer's step() modifies parameters in place; if it runs between two backward() calls, the second backward pass may still need the pre-step values of those parameters, which triggers the RuntimeError.
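The fixed ordering, in the same toy two-optimizer setup (names d, g, opt_d, opt_g are illustrative): both backward() calls finish while every parameter is still unmodified, and only then do the optimizers step.

```python
import torch

torch.manual_seed(0)
d = torch.nn.Linear(4, 1)   # stand-in for the discriminator
g = torch.nn.Linear(4, 4)   # stand-in for the generator
opt_d = torch.optim.SGD(d.parameters(), lr=0.1)
opt_g = torch.optim.SGD(g.parameters(), lr=0.1)

x = torch.randn(8, 4)
fake = g(x)
loss_d = d(fake.detach()).mean()
loss_g = d(fake).mean()

# Correct order: all backward() first, then all step().
loss_d.backward()
loss_g.backward()   # d's parameters are still untouched here, so this is safe
opt_d.step()
opt_g.step()
```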
Cite
https://discuss.pytorch.org/t/solved-pytorch1-5-runtimeerror-one-of-the-variables-needed-for-gradient-computation-has-been-modified-by-an-inplace-operation/90256/