In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known. Here, information is measured in shannons, nats, or hartleys.
The joint entropy measures how much uncertainty there is in the two random variables X and Y taken together.

Definition. The conditional entropy of X given Y is
\[
H(X \mid Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x \mid y) = -E\big[\log p(X \mid Y)\big]. \tag{5}
\]
The conditional entropy is a measure of how much uncertainty remains about the random variable X once the value of Y is known.
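As a concrete illustration of formula (5), here is a minimal Python sketch that computes \(H(X \mid Y)\) in bits for a small made-up joint distribution; the table p_xy and all names in it are hypothetical, not taken from the source.

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1} and Y in {0, 1}.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.0,  (1, 1): 0.5,
}

# Marginal p(y) = sum_x p(x, y).
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# H(X|Y) = -sum_{x,y} p(x,y) * log2 p(x|y), with p(x|y) = p(x,y) / p(y).
h_x_given_y = -sum(
    p * math.log2(p / p_y[y])
    for (x, y), p in p_xy.items()
    if p > 0.0
)

print(f"H(X|Y) = {h_x_given_y:.4f} bits")
```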
Conditional entropy. Let \((X, Y) \sim p\). For \(x \in \operatorname{Supp}(X)\), the random variable \(Y \mid X = x\) is well defined. The entropy of Y conditioned on X is defined by
\[
H(Y \mid X) := \mathop{E}_{x \leftarrow X}\big[ H(Y \mid X = x) \big].
\]
It measures the uncertainty in Y given X. Let \(p_X\) and \(p_{Y \mid X}\) be the marginal and conditional distributions induced by p; then
\[
H(Y \mid X) = \sum_{x \in \mathcal{X}} p_X(x) \cdot H(Y \mid X = x).
\]
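As a worked instance of this weighted-average form, take the hypothetical joint distribution used in the Python sketch above, with \(p(0,0) = p(0,1) = \tfrac{1}{4}\), \(p(1,0) = 0\), \(p(1,1) = \tfrac{1}{2}\). Then
\[
p_X(0) = p_X(1) = \tfrac{1}{2}, \qquad
H(Y \mid X = 0) = 1 \text{ bit}, \qquad
H(Y \mid X = 1) = 0,
\]
so
\[
H(Y \mid X) = \tfrac{1}{2} \cdot 1 + \tfrac{1}{2} \cdot 0 = \tfrac{1}{2} \text{ bit}.
\]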
\[
H(X, Y) = H(X \mid Y) + H(Y) = H(Y \mid X) + H(X).
\]
In words, the joint entropy of X and Y is the conditional entropy of X given Y, plus the marginal entropy of Y (and vice versa).
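A minimal sketch checking this identity numerically for the same hypothetical joint distribution as above; the helper functions are illustrative only, not from the source.

```python
import math

# Same hypothetical joint distribution p(x, y) as above.
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.0, (1, 1): 0.5}

def entropy(dist):
    """Shannon entropy in bits of a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0.0)

def marginal(p_xy, axis):
    """Marginal distribution over the given coordinate (0 for X, 1 for Y)."""
    out = {}
    for pair, p in p_xy.items():
        out[pair[axis]] = out.get(pair[axis], 0.0) + p
    return out

def conditional_entropy(p_xy, given_axis):
    """H(other | given) = -sum_{x,y} p(x,y) log2 p(other | given)."""
    p_given = marginal(p_xy, given_axis)
    return -sum(
        p * math.log2(p / p_given[pair[given_axis]])
        for pair, p in p_xy.items() if p > 0.0
    )

h_xy = entropy(p_xy)
h_x, h_y = entropy(marginal(p_xy, 0)), entropy(marginal(p_xy, 1))
h_x_given_y = conditional_entropy(p_xy, given_axis=1)
h_y_given_x = conditional_entropy(p_xy, given_axis=0)

# Chain rule: H(X,Y) = H(X|Y) + H(Y) = H(Y|X) + H(X).
print(h_xy, h_x_given_y + h_y, h_y_given_x + h_x)  # all three are 1.5 bits
```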
If \((X, Y) \sim p(x, y)\), the conditional entropy \(H(Y \mid X)\) is defined as
\[\begin{split}
H(Y \mid X) & = \sum_{x \in \mathcal{X}} p(x) H(Y \mid X = x) \\
& = - \sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y \mid x) \log p(y \mid x) \\
& = - \sum_{x \in \mathcal{X}}\sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x) \\
& = - E\big[\log p(Y \mid X)\big].
\end{split}\]
The naturalness of the definition of joint entropy and conditional entropy is exhibited by the fact that the entropy of a pair of random variables is the entropy of one plus the conditional entropy of the other. This is proved in the following theorem.

Theorem 2.2.1 (Chain rule): \(H(X, Y) = H(X) + H(Y \mid X)\).
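A one-line proof sketch of the chain rule, written here in the notation of the definitions above (the standard argument: expand the joint log-probability and take expectations):
\[
\log p(x, y) = \log p(x) + \log p(y \mid x)
\quad\Longrightarrow\quad
-E\big[\log p(X, Y)\big] = -E\big[\log p(X)\big] - E\big[\log p(Y \mid X)\big],
\]
i.e. \(H(X, Y) = H(X) + H(Y \mid X)\).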
Figure 2 shows how we can visualize conditional entropy and mutual information. The red and blue pertain to the individual entropies \(H(X)\) and \(H(Y)\), respectively.
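The relationships such a diagram typically depicts are the standard identities linking mutual information to the entropies above:
\[
I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y).
\]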