Cross entropy is a measure defined for two probability distributions $p$ and $q$ over the same underlying random variable.
$$H(p, q) = -\sum_{x_i \in X} p(x_i)\log q(x_i)$$
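As a quick sanity check, here is a minimal Python sketch that evaluates this sum directly; the distributions `p` and `q` below are made-up illustrative values, not taken from the text:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p(x_i) * log q(x_i), using the natural logarithm.

    Terms where p(x_i) == 0 contribute nothing, so they are skipped to
    avoid evaluating log(0).
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative example: p is a one-hot "true" distribution over 3 categories,
# q is a model's estimated distribution over the same categories.
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
print(cross_entropy(p, q))  # -log(0.7) ≈ 0.357
```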
In a classification problem, the random variable $X$ represents the possible category of an instance. $p(x_i)$ or $q(x_i)$ is the probability that the instance belongs to category