Problem record:
UserWarning: reduction: 'mean' divides the total loss by both the batch size and the support size. 'batchmean' divides only by the batch size, and aligns with the KL div math definition. 'mean' will be changed to behave the same as 'batchmean' in the next major release.
Cause:
The reduction of KLDistanceLoss is set to 'mean'.
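The difference behind the warning is easy to verify numerically: 'mean' divides the summed loss by every element (batch size × support size), while 'batchmean' divides by the batch size only, which matches the mathematical definition of KL divergence. A minimal sketch using torch.nn.functional.kl_div (the shapes here are made up for illustration):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch, dims = 4, 10  # hypothetical batch size and support size

# kl_div expects log-probabilities for the input and probabilities for the target
log_p = F.log_softmax(torch.randn(batch, dims), dim=-1)
q = torch.softmax(torch.randn(batch, dims), dim=-1)

total = F.kl_div(log_p, q, reduction='sum')
# 'mean' divides by all batch*dims elements (and triggers the UserWarning)
mean_loss = F.kl_div(log_p, q, reduction='mean')
# 'batchmean' divides by the batch size only
batchmean_loss = F.kl_div(log_p, q, reduction='batchmean')

assert torch.allclose(mean_loss, total / (batch * dims))
assert torch.allclose(batchmean_loss, total / batch)
```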
Solution:
Depending on how the code is structured, there are two fixes:
1. In the loss file, set reduction = 'batchmean' in KLDistanceLoss.
2. If that has no effect, check whether the config file (the snippet below is YAML syntax) also defines KLDistanceLoss, and set reduction to batchmean there as well.
selfsim1_opt:
  type: KLDistanceLoss
  loss_weight: !!float 1e3
  reduction: batchmean
  softmax: False
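For fix 1, the change amounts to defaulting the loss module to 'batchmean'. The sketch below shows what such a module might look like; the real KLDistanceLoss in the loss file may differ, so its signature and the softmax flag are assumptions based on the config keys above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KLDistanceLoss(nn.Module):
    """Hypothetical KL-based loss; default reduction changed to 'batchmean'."""

    def __init__(self, loss_weight=1.0, reduction='batchmean', softmax=False):
        super().__init__()
        self.loss_weight = loss_weight
        self.reduction = reduction  # 'batchmean' divides by batch size only
        self.softmax = softmax      # whether to normalize the target to probabilities

    def forward(self, pred, target):
        # kl_div expects log-probabilities for pred
        log_p = F.log_softmax(pred, dim=-1)
        # with softmax=False the target is assumed to already be a distribution
        q = F.softmax(target, dim=-1) if self.softmax else target
        return self.loss_weight * F.kl_div(log_p, q, reduction=self.reduction)
```

With reduction left at 'batchmean', calling this loss no longer emits the UserWarning, and the magnitude of the loss changes (it is larger than 'mean' by a factor of the support size), so loss_weight may need re-tuning.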
Feel free to share your own solutions.