Mutual Information Always Non-negative
What is the simplest proof that mutual information is always non-negative? i.e., $I(X;Y)\ge0$
By definition, $$I(X;Y) = \sum_{x \in X} \sum_{y \in Y} p(x,y) \log\left(\frac{p(x,y)}{p(x)p(y)}\right) = -\sum_{x \in X} \sum_{y \in Y} p(x,y) \log\left(\frac{p(x)p(y)}{p(x,y)}\right),$$ where the sums run over pairs with $p(x,y) > 0$. Since $-\log$ is convex and $\sum_{x \in X} \sum_{y \in Y} p(x,y) = 1$, Jensen's inequality gives $$I(X;Y) \geq -\log\left( \sum_{x \in X} \sum_{y \in Y} p(x,y)\, \frac{p(x)p(y)}{p(x,y)} \right) = -\log\left( \sum_{x \in X} \sum_{y \in Y} p(x)p(y)\right) \geq -\log 1 = 0,$$ where the last step holds because the double sum is at most $1$ (it equals $1$ when every pair has positive probability) and $-\log$ is decreasing. Q.E.D.
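A quick numerical sanity check of the inequality: the sketch below (the joint distribution is a made-up example, not anything from the question) computes $I(X;Y)$ directly from the definition and confirms it is non-negative.

```python
import numpy as np

# Hypothetical joint distribution p(x, y): rows index x, columns index y
p_xy = np.array([[0.10, 0.20, 0.15],
                 [0.25, 0.05, 0.25]])
assert np.isclose(p_xy.sum(), 1.0)

p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x), shape (2, 1)
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(y), shape (1, 3)

# I(X;Y) = sum over p(x,y) > 0 of p(x,y) * log( p(x,y) / (p(x) p(y)) )
mask = p_xy > 0
I = np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask]))
print(I)  # non-negative, as Jensen's inequality guarantees
```

For an independent pair, i.e. `p_xy = p_x * p_y`, the same computation returns $0$, matching the equality case of Jensen's inequality.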