Shannon entropy
The key concept of classical information theory is the Shannon entropy. Suppose we learn the value of a random variable X. The Shannon entropy of X quantifies how much information we gain, on average, when we learn the value of X. An alternative view is that the entropy of X measures the amount of uncertainty about X before we learn its value. These two views are complementary; we can view the entropy either as a measure of our uncertainty before we learn the value of X, or as a measure of how much information we have gained after we learn the value of X.
We often write the entropy as a function of a probability distribution, $p_1, \ldots, p_n$. The Shannon entropy associated with this probability distribution is defined by

$$H(X) \equiv H(p_1, \ldots, p_n) \equiv -\sum_x p_x \log p_x,$$

where the logarithm is taken to base two, and we adopt the convention $0 \log 0 \equiv 0$, so that outcomes of probability zero contribute nothing to the entropy.
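To make the definition concrete, here is a minimal Python sketch; the helper name `shannon_entropy` is my own, and it assumes base-2 logarithms and the convention $0 \log 0 = 0$ stated above.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_x p_x log2 p_x, with the convention 0 log 0 = 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin carries one full bit of information per toss ...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ... while a heavily biased coin is more predictable, so learning its
# outcome conveys less information on average.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```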
The relative entropy
The relative entropy is a very useful entropy-like measure of the closeness of two probability distributions, $p(x)$ and $q(x)$, over the same index set, $x$. Suppose $p(x)$ and $q(x)$ are two probability distributions over the same index set, $x$. Then the relative entropy of $p(x)$ to $q(x)$ is defined by

$$H(p(x)\,\|\,q(x)) \equiv \sum_x p(x) \log \frac{p(x)}{q(x)} \equiv -H(p(x)) - \sum_x p(x) \log q(x),$$

where we use the conventions $-0 \log 0 \equiv 0$ and $-p(x) \log 0 \equiv +\infty$ if $p(x) > 0$.
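As an illustration, here is a small Python sketch of the definition; the name `relative_entropy` and the explicit handling of zero probabilities (following the conventions above) are my own choices.

```python
import math

def relative_entropy(p, q):
    """Relative entropy H(p || q) = sum_x p(x) log2(p(x)/q(x)).

    Conventions: terms with p(x) = 0 contribute 0, and any x with
    p(x) > 0 but q(x) = 0 makes the relative entropy +infinity.
    """
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total += px * math.log2(px / qx)
    return total

# Identical distributions are "zero distance" apart ...
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))   # 0.0
# ... and the relative entropy grows as the distributions separate.
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # ~0.737
```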
The cross entropy
A closely related quantity is the cross entropy of $p(x)$ with respect to $q(x)$,

$$H(p, q) \equiv -\sum_x p(x) \log q(x),$$

which, by the definition of the relative entropy above, satisfies $H(p(x)\,\|\,q(x)) = H(p, q) - H(p(x))$: the relative entropy is the cross entropy minus the Shannon entropy of $p(x)$.
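A short Python sketch of this identity, assuming the same base-2 convention as above; the helper names are illustrative only.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_x p(x) log2 p(x)."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) log2 q(x)."""
    return -sum(px * math.log2(qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
# The identity H(p || q) = H(p, q) - H(p): subtracting the Shannon entropy
# of p from the cross entropy recovers the relative entropy of p to q.
print(cross_entropy(p, q) - shannon_entropy(p))   # ~0.737 bits
```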