Monday, September 14, 2009

Some key concepts in information theory

Entropy:
H(X) = \sum_{i=1}^n {p(x_i)\,I(x_i)} = -\sum_{i=1}^n {p(x_i) \log_b p(x_i)},
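A minimal Python sketch of this definition (not from the original post), assuming the distribution is given as a plain list of probabilities that sum to 1; the base b defaults to 2 so the result is in bits:

import math

def entropy(probs, base=2):
    # H(X) = -sum_i p(x_i) log_b p(x_i); zero-probability outcomes are skipped (0 log 0 := 0)
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin
print(entropy([0.9, 0.1]))   # about 0.47 bits for a biased coin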
Joint Entropy:
H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y)
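The same computation extends directly to the joint distribution. A sketch, assuming p(x,y) is stored as a dict mapping (x, y) pairs to probabilities (this layout is an illustrative choice, not from the post):

import math

def joint_entropy(pxy, base=2):
    # H(X,Y) = -sum_{x,y} p(x,y) log p(x,y)
    return -sum(p * math.log(p, base) for p in pxy.values() if p > 0)

# Two independent fair bits: H(X,Y) = 2 bits
pxy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(pxy))   # 2.0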
Conditional Entropy:
\begin{align} H(Y|X) &\stackrel{\mathrm{def}}{=} \sum_{x\in\mathcal X} p(x)\,H(Y|X=x)\\ &= -\sum_{x\in\mathcal X} p(x) \sum_{y\in\mathcal Y} p(y|x)\,\log_2 p(y|x)\\ &= -\sum_{x\in\mathcal X}\sum_{y\in\mathcal Y} p(x,y)\,\log_2 p(y|x). \end{align}
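A sketch of the last line of that derivation, H(Y|X) = -sum_{x,y} p(x,y) log p(y|x), again assuming the joint distribution is a dict keyed by (x, y) pairs; p(x) is recovered by marginalizing over y, and p(y|x) = p(x,y)/p(x):

import math
from collections import defaultdict

def conditional_entropy(pxy, base=2):
    px = defaultdict(float)
    for (x, _), p in pxy.items():
        px[x] += p                      # marginal p(x)
    return -sum(p * math.log(p / px[x], base)
                for (x, _), p in pxy.items() if p > 0)

# Uniform X sent through a binary channel that flips its input 20% of the time
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(conditional_entropy(pxy))   # ~0.722 bits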
Mutual Information:
I(X;Y) = \sum_{y \in \mathcal Y} \sum_{x \in \mathcal X} p(x,y) \log_2 \left( \frac{p(x,y)}{p(x)\,p(y)} \right),
\begin{align} I(X;Y) &= H(X) - H(X|Y) \\ &= H(Y) - H(Y|X) \\ &= H(X) + H(Y) - H(X,Y) \\ &= H(X,Y) - H(X|Y) - H(Y|X). \end{align}
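A sketch combining the pieces above: mutual information computed directly from the definition, with the marginals p(x) and p(y) obtained from the joint dict; by the identities it should equal H(Y) - H(Y|X) (helper names here are illustrative, not from the post):

import math
from collections import defaultdict

def mutual_information(pxy, base=2):
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in pxy.items():
        px[x] += p                      # marginal p(x)
        py[y] += p                      # marginal p(y)
    return sum(p * math.log(p / (px[x] * py[y]), base)
               for (x, y), p in pxy.items() if p > 0)

# Same 20%-noise channel as above: I(X;Y) = H(Y) - H(Y|X) = 1 - 0.722, about 0.278 bits
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(mutual_information(pxy))   # ~0.278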
