Cross-Entropy
The cross-entropy between two probability distributions \(p\) and \(q\), defined over the same underlying set of events, measures the average number of bits (or nats) needed to identify an event drawn from \(p\) when the encoding is optimized for \(q\) rather than for \(p\).
Definition
The cross-entropy of the distribution \(q\) relative to a distribution \(p\) over a given set is defined as follows:
\[H(p, q) = -\mathbb{E}_p[\log q]\]
where \(\mathbb{E}_p[\cdot]\) is the expected value operator with respect to the distribution \(p\). The definition may equivalently be expressed in terms of the Kullback-Leibler divergence \(KL[p \mid\mid q]\):
\[H(p, q) = H(p) + KL[p \mid\mid q]\]
where \(H(p)\) is the entropy of \(p\).
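In the discrete case, this decomposition follows directly by expanding the divergence:
\[KL[p \mid\mid q] = \sum_{x\in\mathcal{X}} p(x)\log\frac{p(x)}{q(x)} = \sum_{x\in\mathcal{X}} p(x)\log p(x) - \sum_{x\in\mathcal{X}} p(x)\log q(x) = -H(p) + H(p, q).\]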
For discrete probability distributions \(p\) and \(q\) defined on the same support \(\mathcal{X}\):
\[H(p, q) = -\sum_{x\in\mathcal{X}}p(x)\log{q(x)}\]
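As a minimal numerical sketch of the discrete formula (plain NumPy, with two made-up example distributions over three outcomes), the cross-entropy can be computed directly from the sum:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Discrete cross-entropy H(p, q) = -sum_x p(x) * log q(x), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Clip q away from zero so outcomes with q(x) = 0 do not produce log(0).
    return float(-np.sum(p * np.log(np.clip(q, eps, None))))

# Hypothetical example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(cross_entropy(p, p))  # equals the entropy H(p), about 1.030 nats
print(cross_entropy(p, q))  # H(p, q) = H(p) + KL[p || q], so it is >= H(p)
```

Evaluating \(H(p, p)\) recovers the entropy \(H(p)\), consistent with the decomposition above, since \(KL[p \mid\mid p] = 0\).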
For continuous probability distributions, the sum is replaced by an integral:
\[H(p, q) = -\int_{\mathcal{X}}P(x)\log{Q(x)}\,\mathrm{d}r(x)\]
where \(P\) and \(Q\) are the probability density functions of \(p\) and \(q\) with respect to a reference measure \(r\) (for example, the Lebesgue measure, in which case \(\mathrm{d}r(x) = \mathrm{d}x\)).
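As an illustration of the continuous case, the sketch below takes \(r\) to be the Lebesgue measure and uses two assumed example densities, \(p = \mathcal{N}(0, 1)\) and \(q = \mathcal{N}(1, 2^2)\); it approximates the integral on a truncated grid and compares the result with the known closed form for two Gaussian densities:

```python
import numpy as np

# Hypothetical example densities: p = N(0, 1) and q = N(1, 2^2).
mu_p, sigma_p = 0.0, 1.0
mu_q, sigma_q = 1.0, 2.0

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Approximate H(p, q) = -integral of p(x) * log q(x) dx on a wide, fine grid (in nats).
x = np.linspace(-30.0, 30.0, 600_001)
dx = x[1] - x[0]
h_numeric = -np.sum(gaussian_pdf(x, mu_p, sigma_p) * np.log(gaussian_pdf(x, mu_q, sigma_q))) * dx

# Closed form for two Gaussians:
# H(p, q) = 1/2 * log(2*pi*sigma_q^2) + (sigma_p^2 + (mu_p - mu_q)^2) / (2 * sigma_q^2)
h_closed = 0.5 * np.log(2.0 * np.pi * sigma_q ** 2) \
    + (sigma_p ** 2 + (mu_p - mu_q) ** 2) / (2.0 * sigma_q ** 2)

print(h_numeric, h_closed)  # both approximately 1.862 nats; the two values agree closely
```

The close agreement between the numerical integral and the closed form serves as a simple check of the continuous definition.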