The following articles build an intuitive, example-driven understanding of KL divergence:
Kullback-Leibler Divergence Explained
Light on Math Machine Learning: Intuitive Guide to Understanding KL
Chinese translation of the above: https://www.sohu.com/a/233776078_164987
Related Zhihu Q&A: https://www.zhihu.com/question/29980971
My own one-sentence intuition: KL divergence measures how similar two distributions are (typically, samples from the true distribution on one side, and a chosen model distribution fitted to them on the other). Note that it is asymmetric, so it is a measure of dissimilarity rather than a true distance metric.
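To make the intuition concrete, here is a minimal sketch of the discrete KL divergence; the toy distributions `p` and `q` are made-up illustrative values, not from the articles above.

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) = sum_i p_i * log(p_i / q_i).

    Measures the information lost when Q is used to approximate P.
    It is asymmetric: D(P || Q) != D(Q || P) in general.
    """
    # Terms with p_i == 0 contribute 0 by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy example: a "true" distribution P and a uniform candidate Q.
p = [0.36, 0.48, 0.16]
q = [1 / 3, 1 / 3, 1 / 3]
print(kl_divergence(p, q))  # > 0: Q loses some information about P
print(kl_divergence(p, p))  # 0.0: identical distributions
```

The divergence is zero only when the two distributions coincide, and grows as the fitted distribution explains the true one less well.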
Application example (the motivation for these notes): the KL penalty.
KL divergence is used to measure how similar the new policy is to the old one: if they diverge too much, a heavy penalty is applied; if the divergence is small, the penalty is light.
Other references:
KL divergence (Kullback-Leibler divergence)
https://zr9558.com/2015/11/17/kullback-leibler-divergence/
https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
https://www.cnblogs.com/silent-stranger/p/7987708.html
Notes:
Figure 1 source: Light on Math Machine Learning: Intuitive Guide to Understanding KL
Figure 2 source: https://morvanzhou.github.io/tutorials/machine-learning/reinforcement-learning/6-4-DPPO/