Definition. The conditional entropy of X given Y is H(X|Y) = −∑_{x,y} p(x, y) log p(x|y) = −E[ log p(X|Y) ] (5). The conditional entropy is a measure of how much uncertainty remains about the random variable X once the value of Y is known.
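A minimal sketch of this definition in Python (the function name and the 2×2 joint distribution are assumptions of mine, not from the text), computing H(X|Y) in bits directly from p(x, y):

```python
# Sketch: conditional entropy H(X|Y) = -sum_{x,y} p(x,y) * log2 p(x|y), in bits.
import numpy as np

def conditional_entropy(joint):
    """Rows index x, columns index y; joint[x, y] = p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    p_y = joint.sum(axis=0)                              # marginal p(y)
    h = 0.0
    for x in range(joint.shape[0]):
        for y in range(joint.shape[1]):
            p_xy = joint[x, y]
            if p_xy > 0:
                h -= p_xy * np.log2(p_xy / p_y[y])       # log2 p(x|y)
    return h

# Illustrative joint distribution over a 2x2 alphabet (assumed example):
p = [[0.25, 0.25],
     [0.50, 0.00]]
print(conditional_entropy(p))   # uncertainty left in X once Y is observed
```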
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known. Here, information is measured in shannons, nats, or hartleys.
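As a small illustration (the example distribution is mine, not from the text), the same entropy can be expressed in any of these units simply by changing the base of the logarithm:

```python
# Same entropy in shannons (log base 2), nats (base e), and hartleys (base 10).
import math

p = [0.5, 0.25, 0.25]                              # assumed example distribution
H_bits     = -sum(q * math.log2(q)  for q in p)    # shannons
H_nats     = -sum(q * math.log(q)   for q in p)    # nats
H_hartleys = -sum(q * math.log10(q) for q in p)    # hartleys

print(H_bits, H_nats, H_hartleys)
print(H_nats / math.log(2))                        # converting nats back to shannons
```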
Related quantities include conditional information and relative entropy (also called discrimination or Kullback–Leibler information), along with the limiting normalized versions of these quantities, such as entropy rate and information rate.
Definition 8.2 (Conditional entropy). The conditional entropy of a random variable is the entropy of one random variable conditioned on knowledge of another random variable, on average. Alternative interpretation: the average number of yes/no questions needed to identify X given knowledge of Y.
Conditional entropy. Let (X, Y) ∼ p. For x ∈ Supp(X), the random variable Y|X = x is well defined. The entropy of Y conditioned on X is defined by H(Y|X) := E_{x←X}[ H(Y|X = x) ]. It measures the uncertainty in Y given X. Let p_X and p_{Y|X} be the marginal and conditional distributions induced by p. Then H(Y|X) = ∑_{x∈X} p_X(x) ⋅ H(Y|X = x).
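A short sketch of this decomposition (the helper names and the example joint distribution are assumed, not taken from the notes), computing H(Y|X) as the p_X-weighted average of the row entropies H(Y|X = x):

```python
# H(Y|X) = sum_x p_X(x) * H(Y|X = x), computed row by row from the joint table.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy_from_rows(joint):
    """joint[x, y] = p(x, y); returns sum_x p_X(x) * H(Y|X = x)."""
    joint = np.asarray(joint, dtype=float)
    p_x = joint.sum(axis=1)                          # marginal p_X
    h = 0.0
    for x, px in enumerate(p_x):
        if px > 0:
            h += px * entropy(joint[x] / px)         # entropy of the row p(y|x)
    return h

joint = np.array([[0.25, 0.25],
                  [0.50, 0.00]])
print(conditional_entropy_from_rows(joint))          # H(Y|X)
```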
Consider a random variable X that takes values a, b, c, and d with probabilities 1/3, 1/3, 1/4, and 1/12, respectively. A Shannon code would encode a, b, c, and d with 2, 2, 2, and 4 bits, respectively. On the other hand, there is an optimal Huffman code encoding a, b, c, and d with 1, 2, 3, and 3 bits, respectively.
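A rough sketch reproducing these code lengths (the function names and the heapq-based tie-breaking are my own choices, not from the text): Shannon lengths are ⌈log2(1/p(x))⌉, and a Huffman code is built by repeatedly merging the two least likely nodes.

```python
# Shannon code lengths versus Huffman code lengths for the four-symbol source above.
import heapq, math

probs = {'a': 1/3, 'b': 1/3, 'c': 1/4, 'd': 1/12}

# Shannon code: l(x) = ceil(log2(1/p(x)))
shannon = {s: math.ceil(math.log2(1 / p)) for s, p in probs.items()}
print(shannon)                                       # {'a': 2, 'b': 2, 'c': 2, 'd': 4}

def huffman_lengths(probs):
    """Codeword lengths of a Huffman code; ties go to already-merged nodes."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = -1                                         # merged nodes sort first on ties
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)              # two least likely nodes
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}   # one level deeper
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie -= 1
    return heap[0][2]

print(sorted(huffman_lengths(probs).values()))       # [1, 2, 3, 3]
```

For comparison, the Shannon code averages 13/6 ≈ 2.17 bits per symbol, the Huffman code exactly 2 bits, and the entropy of this source is about 1.85 bits; a different but equally optimal Huffman code assigns 2 bits to every symbol.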
The uniform distribution has maximum entropy among all distributions with finite discrete support. Theorem. H(X) ≤ log|X|, where |X| is the number of elements in the alphabet X, with equality iff X has the uniform distribution. Proof: Let U be a uniformly distributed RV with u(x) = 1/|X|. Then 0 ≤ D(p||u) = ∑ p(x) log ( p(x) / u(x) ) (1) = ∑ p(x) log|X| + ∑ p(x) log p(x) = log|X| − H(X) (2). Since D(p||u) ≥ 0, it follows that H(X) ≤ log|X|, with equality iff D(p||u) = 0, i.e., iff p = u.
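A quick numerical check of the identity used in this proof (the four-symbol distribution is an arbitrary example of mine): with u uniform on the same support, D(p||u) equals log|X| − H(X), which forces H(X) ≤ log|X|.

```python
# Verify D(p || u) = log|X| - H(X) for a uniform u on the same support.
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])      # any distribution on 4 symbols
u = np.full_like(p, 1 / p.size)              # uniform distribution

H  = -np.sum(p * np.log2(p))                 # entropy of p, in bits
KL = np.sum(p * np.log2(p / u))              # D(p || u) >= 0

print(H, KL, np.log2(p.size) - H)            # KL equals log|X| - H(X)
print(H <= np.log2(p.size))                  # hence H(X) <= log|X|
```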