Mutual Information and Conditional Mutual Information

Let \text{I}(X;Y) denote the mutual information between random variables X and Y, and \text{I}(X;Y|Z) the conditional mutual information between X and Y given Z.

Here I give one example for each of the following inequalities, showing that neither of them holds in general; the entropy identities used in the calculations are recalled just after the list.

  1. \text{I}(X;Y|Z) \geq \text{I}(X;Y)
  2. \text{I}(X;Y|Z) \leq \text{I}(X;Y)
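
As a quick reminder, the calculations below use the standard entropy decompositions (with \text{H} denoting Shannon entropy, a symbol I am introducing here):

\text{I}(X;Y) = \text{H}(X) - \text{H}(X|Y), \qquad \text{I}(X;Y|Z) = \text{H}(X|Z) - \text{H}(X|Y,Z) = \sum_z \text{P}(Z=z)\, \text{I}(X;Y|Z=z),

where \text{I}(X;Y|Z=z) is the mutual information of the conditional distribution of (X,Y) given Z=z.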

Example 1: For \text{I}(X;Y|Z) \geq \text{I}(X;Y)

Let X and Y be independent, uniformly distributed random bits, and let Z = X \oplus Y. Then,

LHS = \text{I}(X;Y|Z) = {\text{I}(X;Y|Z=0) + \text{I}(X;Y|Z=1) \over 2} = {1 + 1 \over 2} = 1,

since, given either value of Z, X is still a uniform bit and X determines Y (X = Y when Z=0 and X = 1 \oplus Y when Z=1), so \text{I}(X;Y|Z=z) = \text{H}(X|Z=z) = 1 bit.

RHS = \text{I}(X;Y) = 0, since X and Y are independent.

Hence, \text{I}(X;Y|Z) > \text{I}(X;Y) in this example, so inequality 2 cannot hold in general.
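
If you want to double-check these numbers, here is a small Python sketch that computes both quantities by brute force from the joint distribution of (X, Y, Z); the helper names (marginal, mutual_information, conditional_mutual_information) are just ones made up for this post:

from itertools import product
from math import log2

# Joint pmf of (X, Y, Z) with X, Y independent uniform bits and Z = X XOR Y:
# probability 1/4 on each tuple (x, y, x ^ y).
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(pmf, keep):
    # Marginalize a joint pmf (dict: outcome tuple -> probability)
    # onto the coordinate indices listed in `keep`.
    out = {}
    for outcome, prob in pmf.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(p_xy):
    # I(X;Y) computed directly from a joint pmf over pairs (x, y).
    p_x = marginal(p_xy, (0,))
    p_y = marginal(p_xy, (1,))
    return sum(p * log2(p / (p_x[(x,)] * p_y[(y,)]))
               for (x, y), p in p_xy.items() if p > 0)

def conditional_mutual_information(p_xyz):
    # I(X;Y|Z) = sum over z of P(Z=z) * I(X;Y | Z=z).
    p_z = marginal(p_xyz, (2,))
    total = 0.0
    for (z,), pz in p_z.items():
        # Conditional joint pmf of (X, Y) given Z = z.
        cond = {(x, y): p / pz for (x, y, w), p in p_xyz.items() if w == z}
        total += pz * mutual_information(cond)
    return total

print("I(X;Y)   =", mutual_information(marginal(joint, (0, 1))))  # 0.0
print("I(X;Y|Z) =", conditional_mutual_information(joint))        # 1.0

Running it should print 0.0 for the unconditional mutual information and 1.0 for the conditional one, matching the calculation above.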

Example 2: For \text{I}(X;Y|Z) \leq \text{I}(X;Y)

Let X = Y = Z, where X is a uniformly distributed random bit. Then,

LHS = \text{I}(X;Y|Z) = 0, since, given Z, both X and Y are constant (equal to Z).

RHS = \text{I}(X;Y) = \text{H}(X) = 1, since Y = X.

Hence, \text{I}(X;Y|Z) < \text{I}(X;Y) in this example, so inequality 1 cannot hold in general.
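
The same kind of brute-force check works for this example. Here is a self-contained Python sketch (helper names again made up) that uses the entropy identity \text{I}(X;Y|Z) = \text{H}(X,Z) + \text{H}(Y,Z) - \text{H}(X,Y,Z) - \text{H}(Z):

from math import log2

# Joint pmf of (X, Y, Z) with X = Y = Z and X a uniform bit:
# probability 1/2 on (0, 0, 0) and on (1, 1, 1).
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

def entropy(pmf):
    # Shannon entropy in bits of a pmf given as {outcome: probability}.
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, keep):
    # Marginalize the joint pmf onto the coordinate indices in `keep`.
    out = {}
    for outcome, prob in pmf.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + prob
    return out

# I(X;Y) = H(X) + H(Y) - H(X,Y)
i_xy = (entropy(marginal(joint, (0,))) + entropy(marginal(joint, (1,)))
        - entropy(marginal(joint, (0, 1))))

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
i_xy_given_z = (entropy(marginal(joint, (0, 2))) + entropy(marginal(joint, (1, 2)))
                - entropy(joint) - entropy(marginal(joint, (2,))))

print("I(X;Y)   =", i_xy)          # 1.0
print("I(X;Y|Z) =", i_xy_given_z)  # 0.0

It should print 1.0 for \text{I}(X;Y) and 0.0 for \text{I}(X;Y|Z), as computed above.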
