# Mutual Information and Conditional Mutual Information

Let $\text{I}(X;Y)$ represent the mutual information between random variables $X$ and $Y$.

Here I give an example for each of the following inequalities, showing that neither holds in general:

1. $\text{I}(X;Y|Z) \geq \text{I}(X;Y)$
2. $\text{I}(X;Y|Z) \leq \text{I}(X;Y)$

Example 1: A case where $\text{I}(X;Y|Z) \geq \text{I}(X;Y)$

Let $X$ and $Y$ be independent uniformly distributed random bits and let $Z = X \oplus Y$. Then,

LHS $= \text{I}(X;Y|Z) = {\text{I}(X;Y|Z=0) + \text{I}(X;Y|Z=1) \over 2}$

Given $Z = z$, we have $Y = X \oplus z$, so $Y$ is fully determined by the uniform bit $X$ and $\text{I}(X;Y|Z=z) = \text{H}(X) = 1$ bit. Therefore,

LHS $= {1 + 1 \over 2} = 1$

RHS $= \text{I}(X;Y) = 0$, since $X$ and $Y$ are independent.

Hence, $\text{I}(X;Y|Z) \geq \text{I}(X;Y)$, and in fact the inequality is strict here ($1 > 0$).
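As a sanity check, here is a small Python sketch that computes both quantities directly from the joint distribution of $(X, Y, Z)$. The helper names `marginal`, `mi`, and `cmi` are my own, not from the text; `cmi` implements the same averaging over $Z$ used above.

```python
from itertools import product
from math import log2

# Joint distribution of (x, y, z) with X, Y independent uniform bits
# and Z = X XOR Y: four equally likely outcomes.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(dist, idx):
    """Marginal distribution over the coordinates listed in idx."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def mi(dist):
    """I(X;Y) in bits for a joint distribution over (x, y) pairs."""
    px = marginal(dist, (0,))
    py = marginal(dist, (1,))
    return sum(p * log2(p / (px[(x,)] * py[(y,)]))
               for (x, y), p in dist.items() if p > 0)

def cmi(dist):
    """I(X;Y|Z) = sum_z P(Z=z) * I(X;Y|Z=z) for a joint over (x, y, z)."""
    pz = marginal(dist, (2,))
    total = 0.0
    for (z,), pzv in pz.items():
        cond = {(x, y): p / pzv for (x, y, zz), p in dist.items() if zz == z}
        total += pzv * mi(cond)
    return total

print(mi(marginal(joint, (0, 1))))  # I(X;Y)   -> 0.0
print(cmi(joint))                   # I(X;Y|Z) -> 1.0
```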

Example 2: A case where $\text{I}(X;Y|Z) \leq \text{I}(X;Y)$

Let $X = Y = Z$, where $X$ is a uniformly distributed random bit. Given $Z$, both $X$ and $Y$ are fully determined, so

LHS $= \text{I}(X;Y|Z) = 0$

RHS $= \text{I}(X;Y) = \text{H}(X) = 1$

Hence, $\text{I}(X;Y|Z) \leq \text{I}(X;Y)$, and again the inequality is strict ($0 < 1$).
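This example can also be checked numerically. A compact way, using the entropy identities $\text{I}(X;Y) = \text{H}(X) + \text{H}(Y) - \text{H}(X,Y)$ and $\text{I}(X;Y|Z) = \text{H}(X,Z) + \text{H}(Y,Z) - \text{H}(Z) - \text{H}(X,Y,Z)$ (standard identities, not derived in the text; the helper name `H` is my own):

```python
from math import log2

# Joint distribution for X = Y = Z, a single uniform random bit:
# only (0,0,0) and (1,1,1) occur, each with probability 1/2.
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

def H(dist, idx):
    """Entropy (in bits) of the marginal over the coordinates in idx."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi_xy = H(joint, (0,)) + H(joint, (1,)) - H(joint, (0, 1))
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
cmi = H(joint, (0, 2)) + H(joint, (1, 2)) - H(joint, (2,)) - H(joint, (0, 1, 2))

print(mi_xy)  # I(X;Y)   -> 1.0
print(cmi)    # I(X;Y|Z) -> 0.0
```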