Information Theory: Self-Information, Entropy, Relative Entropy, Cross Entropy, Conditional Entropy
Self-Information: $I(x) = \log \frac{1}{P(x)}$ — rarer events carry more information.

Entropy: $H(X) = E[I(X)] = E\left[\log \frac{1}{P(X)}\right] = \sum_{x \in X} P(x) \log \frac{1}{P(x)}$ — the expected self-information of $X$.

Relative entropy (KL divergence): $D_{KL}(P \| Q) = \sum_{x \in X} P(x) \log \frac{P(x)}{Q(x)}$ — the expected extra information cost of coding samples from $P$ with a code built for $Q$.

Cross entropy: $H(P, Q) = \sum_{x \in X} P(x) \log \frac{1}{Q(x)} = H(P) + D_{KL}(P \| Q)$

Conditional entropy: $H(Y \mid X) = \sum_{x \in X} P(x) H(Y \mid X = x) = \sum_{x \in X} \sum_{y \in Y} P(x, y) \log \frac{1}{P(y \mid x)}$
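A minimal sketch of these definitions in Python, assuming natural logarithms (units of nats) and discrete distributions given as probability lists; the function names are illustrative, not from any particular library:

```python
import math

def self_information(p):
    # I(x) = log(1/P(x)): rarer events carry more information
    return math.log(1.0 / p)

def entropy(dist):
    # H(X) = sum_x P(x) * log(1/P(x)); terms with P(x) = 0 contribute 0
    return sum(p * math.log(1.0 / p) for p in dist if p > 0)

def cross_entropy(p, q):
    # H(P, Q) = sum_x P(x) * log(1/Q(x))
    return sum(pi * math.log(1.0 / qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # D_KL(P || Q) = H(P, Q) - H(P), the relative entropy
    return cross_entropy(p, q) - entropy(p)

# Example: a fair coin has maximal entropy log(2) ≈ 0.6931 nats
p = [0.5, 0.5]
q = [0.9, 0.1]
print(entropy(p))           # ~0.6931
print(cross_entropy(p, q))  # ~1.2040
print(kl_divergence(p, q))  # ~0.5108
```

The example also checks the identity $H(P, Q) = H(P) + D_{KL}(P \| Q)$: $1.2040 \approx 0.6931 + 0.5108$.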