[1] Which Training Methods for GANs do actually Converge?
Lars Mescheder, Andreas Geiger, Sebastian Nowozin
MPI, ETH, Microsoft Research
http://proceedings.mlr.press/v80/mescheder18a/mescheder18a.pdf
The paper compares the convergence properties of different GAN training methods.
A simple example shows that gradient-descent-based GAN optimization generally does not converge.
The Dirac-GAN that drives this example is defined below.
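In brief, following the paper: the generator produces a point mass p_θ = δ_θ, the true data distribution is p_D = δ_0, and the discriminator is linear, D_ψ(x) = ψ·x. With the standard GAN loss f(t) = -log(1 + e^(-t)), the training objective reduces to L(θ, ψ) = f(ψθ) + f(0), whose unique equilibrium is (θ, ψ) = (0, 0). The Jacobian of the gradient vector field at that equilibrium has purely imaginary eigenvalues ±f'(0)i, so simultaneous gradient descent circles the equilibrium (and with a finite step size slowly spirals away from it) instead of converging.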
The different training methods are then compared.
Their convergence properties are compared as well.
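To make that comparison concrete, here is a minimal, self-contained simulation (not taken from the linked repository; the step size, step count, and R1 weight gamma are arbitrary illustrative choices) of simultaneous gradient descent on the Dirac-GAN, with and without the zero-centered gradient penalty on the discriminator (R1), which for this example reduces to (gamma/2)·ψ²:

```python
import math

def f_prime(t):
    """Derivative of f(t) = -log(1 + exp(-t)), i.e. the logistic sigmoid (numerically stable)."""
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

def train_dirac_gan(steps=5000, lr=0.1, gamma=0.0, theta=1.0, psi=1.0):
    """Simultaneous gradient descent on the Dirac-GAN objective L = f(psi * theta) + f(0).

    theta: generator parameter (the generator outputs the point mass at theta)
    psi:   discriminator parameter (D_psi(x) = psi * x)
    gamma: weight of the R1 penalty, which here equals (gamma / 2) * psi ** 2
    Returns the final distance from the equilibrium (0, 0).
    """
    for _ in range(steps):
        d_theta = -f_prime(psi * theta) * psi                # generator descends L
        d_psi = f_prime(psi * theta) * theta - gamma * psi   # discriminator ascends L - R1
        theta, psi = theta + lr * d_theta, psi + lr * d_psi  # simultaneous update
    return math.hypot(theta, psi)

print("unregularized GAN :", train_dirac_gan(gamma=0.0))  # distance from (0, 0) grows: no convergence
print("R1-regularized GAN:", train_dirac_gan(gamma=1.0))  # distance shrinks towards 0: converges
```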
Code:
https://github.com/LMescheder/GAN_stability
----------------------------------------
[2] Orthogonal Recurrent Neural Networks with Scaled Cayley Transform
Kyle E. Helfrich, Devin Willmott, Qiang Ye
University of Kentucky
http://proceedings.mlr.press/v80/helfrich18a/helfrich18a.pdf
The different methods are compared.
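For reference, a minimal NumPy sketch of the scaled Cayley transform that, as I read the paper, keeps the scoRNN recurrent matrix orthogonal: the hidden-to-hidden matrix is parameterized as W = (I + A)^(-1)(I - A)D, with A a trainable skew-symmetric matrix and D a fixed diagonal matrix of ±1 entries (the scaling that lets W reach eigenvalue -1, which the plain Cayley transform cannot). The dimensions and random values below are illustrative only:

```python
import numpy as np

def scaled_cayley(A, d):
    """Scaled Cayley transform: W = (I + A)^{-1} (I - A) D.

    A must be skew-symmetric (A.T == -A) and d a vector of +/-1 entries;
    the resulting W is orthogonal by construction.
    """
    n = A.shape[0]
    I = np.eye(n)
    return np.linalg.solve(I + A, I - A) @ np.diag(d)

rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M - M.T                                   # trainable part: skew-symmetric
d = np.where(rng.random(n) < 0.5, -1.0, 1.0)  # fixed +/-1 scaling matrix D
W = scaled_cayley(A, d)

print(np.allclose(W.T @ W, np.eye(n)))        # True: W.T @ W = I, so W is orthogonal
```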