Entropy: $H(X) = -\sum_{x} p(x)\log p(x)$
Information gain: $I(X,Y) = H(X) - H(X \mid Y)$
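As a minimal sketch of the two formulas above (the helper names and the toy data are my own illustration, not from the original post): entropy is computed over the empirical distribution of a label sequence, and information gain subtracts the weighted conditional entropy after partitioning the labels by a feature.

```python
import math
from collections import Counter

def entropy(labels):
    """H(X) = -sum_x p(x) * log2 p(x), using empirical frequencies."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """I(X, Y) = H(X) - H(X|Y), where `groups` partitions labels by Y's value."""
    n = len(labels)
    conditional = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - conditional

# Hypothetical toy data: a 50/50 binary label, perfectly separated by a feature.
labels = ['yes', 'yes', 'no', 'no']
groups = [['yes', 'yes'], ['no', 'no']]
print(entropy(labels))                   # 1.0 bit for a uniform binary label
print(information_gain(labels, groups))  # 1.0: the split removes all uncertainty
```

A feature that tells us nothing (e.g. groups mixing both classes equally) would yield an information gain of 0.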
References:
Cnblogs official guide: https://www.cnblogs.com/cmt/p/3279312.html
Online LaTeX equation editor: https://private.codecogs.com/latex/eqneditor.php?lang=zh-cn
Reposted from: https://www.cnblogs.com/yaoyaohust/p/11153072.html