Hi,

Hoping someone can give me some pointers with this entropy problem.

Say X is chosen uniformly at random from the integers 0-32 (inclusive).

I calculate the entropy as H(X) = 32 bits, since each value of X is equally likely.

Now, say the following pseudocode executes.

int r = rand(0,1); // a random integer 0 or 1

r = r * 33 + X;

How would I work out the mutual information between the two variables r and X?

Mutual information is defined as I(X; Y) = H(X) - H(X|Y), but I don't really understand how to apply the conditional entropy H(X|Y) to this problem.

Thanks

A:

If this is homework, then I'll just give hints. (Also, I'll suppose X ~ unif[0,31] so the numbers work out nicely. Are you sure it's [0,32] and not [0,31]?)

First, check your calculation for H(X). For a uniform distribution over 32 equally likely values, H(X) = log2(32) = 5 bits, not 32.
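
If you want to sanity-check that number in code, here's a minimal sketch (Python is my choice here, not anything from your post) that just evaluates -sum p*log2(p) over the 32 equally likely outcomes:

import math

# Entropy of a uniform distribution over 32 outcomes, each with p = 1/32.
h_x = -sum((1/32) * math.log2(1/32) for _ in range(32))
print(h_x)  # 5.0 bits, i.e. log2(32)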

Second, the assignment r = r * 33 + X reuses r on both sides: read as an equation it makes no sense, and read as code it overwrites the very random bit you want to measure against X. Let me use distinct variables (and 32 rather than 33, to match my unif[0,31] assumption above):

Y = 32R + X

Third, you don't state the probability distribution of R. Assuming 0 and 1 are equiprobable, H(R) = 1 bit.

As you said, I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X). Consider H(Y|X). If X is given, i.e. held constant, then Y depends only on R, right? Therefore, H(Y|X) = H(R).

Finally, use the law of total probability to compute the probability distribution of Y. (Hint: it's a simple one.) Then you can compute H(Y), and finally, I(X;Y).
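
If you want to check your final numbers, here's a short Python sketch of my own, under the same assumptions (X ~ unif[0,31], R equiprobable on {0,1}, Y = 32R + X), that enumerates the joint distribution exactly:

import math
from collections import defaultdict

def entropy(dist):
    # Shannon entropy in bits of a {outcome: probability} dict.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Joint distribution of (X, Y) with X ~ unif{0..31}, R ~ unif{0,1}, Y = 32R + X.
joint = defaultdict(float)
for x in range(32):
    for r in (0, 1):
        joint[(x, 32 * r + x)] += (1 / 32) * (1 / 2)

# Marginals -- this is the law of total probability in code.
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

h_x, h_y, h_xy = entropy(px), entropy(py), entropy(joint)
print(h_x)               # 5.0 -> H(X)
print(h_y)               # 6.0 -> H(Y): Y is uniform on {0..63}
print(h_xy - h_x)        # 1.0 -> H(Y|X) = H(X,Y) - H(X) = H(R)
print(h_x + h_y - h_xy)  # 5.0 -> I(X;Y) = H(X) + H(Y) - H(X,Y)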

Steve