Joint entropy
The '''joint entropy''' is an [[information entropy|entropy measure]] used in [[information theory]]. It measures the amount of [[entropy (information theory)|entropy]] contained in a joint system of two [[random variables]]. If the random variables are <math>X</math> and <math>Y</math>, the joint entropy is written <math>H(X,Y)</math>. Like other entropies, the joint entropy can be measured in [[bit]]s, [[nat (unit)|nat]]s, or [[ban (information)|hartley]]s, depending on the base of the [[logarithm]].
==Background==
Given a random variable <math>X</math>, the entropy <math>H(X)</math> describes our uncertainty about the value of <math>X</math>. If <math>X</math> can take the values <math>x</math>, each occurring with probability <math>p_x</math>, then the entropy of <math>X</math> is
:<math>H(X) = -\sum_x p_x \log_2(p_x) \!</math>
Consider another random variable <math>Y</math>, whose [[event (probability theory)|event]]s <math>y</math> occur with probabilities <math>p_y</math>; its entropy is <math>H(Y)</math>.
However, if <math>X</math> and <math>Y</math> describe related events, the total entropy of the system may not be <math>H(X)+H(Y)</math>. For example, imagine we choose an [[integer]] between 1 and 8, with equal probability for each integer. Let <math>X</math> represent whether the integer is [[even and odd numbers|even]], and <math>Y</math> represent whether the integer is [[prime number|prime]]. One-half of the integers between 1 and 8 are even, and one-half are prime, so <math>H(X)=H(Y)=1</math>. However, if we know that the integer is even, there is only a 1 in 4 chance that it is also prime; the distributions are related. The total entropy of the system is less than 2 bits. We need a way of measuring the total entropy of both systems.
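The two marginal entropies in this example can be checked directly from the formula above. The following is a minimal Python sketch (the function name <code>entropy</code> is chosen here purely for illustration):

<syntaxhighlight lang="python">
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# X: is a uniformly chosen integer from 1 to 8 even?   Marginal: (1/2, 1/2)
# Y: is it prime (with 1 not counted as prime)?        Marginal: (1/2, 1/2)
print(entropy([1/2, 1/2]))  # H(X) = 1.0 bit
print(entropy([1/2, 1/2]))  # H(Y) = 1.0 bit
</syntaxhighlight>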
==Definition==
We solve this by considering each ''pair'' of possible outcomes <math>(x,y)</math>. If each pair of outcomes occurs with probability <math>p_{x,y}</math>, the joint entropy is defined as
:<math>H(X,Y) = -\sum_{x,y} p_{x,y} \log_2(p_{x,y}) \!</math>
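Translated directly into code, this definition might be sketched as follows (a dictionary mapping outcome pairs to probabilities is just one possible representation of the joint distribution):

<syntaxhighlight lang="python">
from math import log2

def joint_entropy(joint_probs):
    """Joint entropy in bits, given a dict mapping each outcome pair
    (x, y) to its joint probability p_{x,y}."""
    return -sum(p * log2(p) for p in joint_probs.values() if p > 0)
</syntaxhighlight>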
In the example above, 1 is not counted as a prime. The joint probability distribution is then
:<math>P(\text{even}, \text{prime}) = P(\text{odd}, \text{not prime}) = \tfrac{1}{8}</math>
:<math>P(\text{even}, \text{not prime}) = P(\text{odd}, \text{prime}) = \tfrac{3}{8}</math>
Thus, the joint entropy is
:<math>H(X,Y) = -2 \cdot \tfrac{1}{8}\log_2\tfrac{1}{8} - 2 \cdot \tfrac{3}{8}\log_2\tfrac{3}{8} \approx 1.8</math> bits.
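Applying the <code>joint_entropy</code> sketch above to this distribution gives the same value:

<syntaxhighlight lang="python">
# Joint distribution of (parity, primality) for a uniform integer from 1 to 8,
# with 1 not counted as prime.
p_xy = {
    ("even", "prime"):     1/8,  # only 2
    ("odd",  "not prime"): 1/8,  # only 1
    ("even", "not prime"): 3/8,  # 4, 6, 8
    ("odd",  "prime"):     3/8,  # 3, 5, 7
}
print(joint_entropy(p_xy))  # ≈ 1.8113 bits
</syntaxhighlight>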
==Properties==
===Greater than subsystem entropies===
The joint entropy is always at least as large as the entropy of each individual variable; adding a second variable can never decrease the uncertainty.
:<math>H(X,Y) \geq H(X)</math>
This inequality is an equality if and only if <math>Y</math> is a (deterministic) [[function (mathematics)|function]] of <math>X</math>.
In that case we also have
:<math>H(X) \geq H(Y)</math>
===Subadditivity===
Two systems, considered together, can never have more entropy than the sum of their individual entropies. This is an example of [[subadditivity]].
:<math>H(X,Y) \leq H(X) + H(Y)</math>
This inequality is an equality if and only if <math>X</math> and <math>Y</math> are [[statistically independent]].
===Bounds===
Like other entropies, the joint entropy is always nonnegative: <math>H(X,Y) \geq 0</math>.
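These three properties can be verified numerically on the running example (a self-contained sketch; the entropy values are those computed earlier):

<syntaxhighlight lang="python">
from math import log2

# Running example: uniform integer from 1 to 8, X = parity, Y = primality.
H_X, H_Y = 1.0, 1.0
H_XY = -2 * (1/8) * log2(1/8) - 2 * (3/8) * log2(3/8)  # ≈ 1.8113 bits

assert H_XY >= H_X and H_XY >= H_Y  # at least as large as each marginal entropy
assert H_XY <= H_X + H_Y            # subadditivity; strict here, since X and Y are dependent
assert H_XY >= 0                    # nonnegativity
</syntaxhighlight>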
==Relations to other entropy measures==
The joint entropy is used in the definitions of the [[conditional entropy]]:
:<math>H(X|Y) = H(X,Y) - H(Y)\,</math>
and the [[mutual information]]:
:<math>I(X;Y) = H(X) + H(Y) - H(X,Y)\,</math>
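On the running example, these identities give the conditional entropy and the mutual information directly (a self-contained sketch using the values computed earlier):

<syntaxhighlight lang="python">
from math import log2

H_X, H_Y = 1.0, 1.0
H_XY = -2 * (1/8) * log2(1/8) - 2 * (3/8) * log2(3/8)  # ≈ 1.8113 bits

H_X_given_Y = H_XY - H_Y   # H(X|Y) ≈ 0.8113 bits
I_XY = H_X + H_Y - H_XY    # I(X;Y) ≈ 0.1887 bits
print(H_X_given_Y, I_XY)
</syntaxhighlight>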
In [[quantum information theory]], the joint entropy is generalized into the [[joint quantum entropy]].
==References==
# {{cite book |last1=Korn |first1=Granino Arthur |last2=Korn |first2=Theresa M. |title=Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review |publisher=Dover Publications |location=New York |pages=613–614 |isbn=0-486-41147-8}}
[[Category:Entropy and information]]
[[de:Blockentropie]]
[[fr:Entropie conjointe]]
[[ja:結合エントロピー]]