information-theory

Mutual Information / Entropy Calculation Help

Hi, hoping someone can give me some pointers with this entropy problem. Say X is chosen randomly from the uniform integer distribution 0-32 (inclusive). I calculate the entropy as H(X) = 32 bits, since each Xi has an equal probability of occurring. Now, say the following pseudocode executes: int r = rand(0,1); // a random integer, 0 or 1 r ...
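Worth noting for this question: 0-32 inclusive is 33 equally likely values, and the entropy of a uniform distribution over n outcomes is log2(n), so H(X) = log2(33) ≈ 5.04 bits rather than 32. A minimal Python check (the function name is illustrative, not from the post):

```python
import math

def uniform_entropy(n_outcomes):
    """Shannon entropy, in bits, of a uniform distribution over n_outcomes values."""
    p = 1.0 / n_outcomes
    # H(X) = -sum_i p * log2(p) = log2(n) when all n outcomes are equiprobable
    return -sum(p * math.log2(p) for _ in range(n_outcomes))

print(uniform_entropy(33))  # log2(33) ≈ 5.044
```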

What is a parity check matrix? (Information theory)

Hi all, I'm studying information theory, but there's one thing I can't seem to work out. I know that given a linear code C and a generator matrix M, I can work out all the possible codewords of C. However, I do not understand: what a parity check matrix is (http://en.wikipedia.org/wiki/Parity-check_matrix), or how to make a parity check matrix from ...
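For the standard-form case there is a mechanical answer: if the generator matrix is G = [I_k | P], the parity check matrix is H = [P^T | I_{n-k}] (over GF(2), where -P^T = P^T), and every codeword c satisfies H c^T = 0. A pure-Python sketch, using the common [7,4] Hamming generator as an example (that particular G is an illustration, not from the post):

```python
def parity_check_from_generator(G):
    """Given a binary generator matrix in standard form G = [I_k | P]
    (a list of 0/1 row lists), return H = [P^T | I_{n-k}] over GF(2)."""
    k, n = len(G), len(G[0])
    r = n - k
    H = []
    for j in range(r):
        # row j of H: column k+j of G (i.e. row j of P^T), then identity row j
        left = [G[i][k + j] for i in range(k)]
        right = [1 if t == j else 0 for t in range(r)]
        H.append(left + right)
    return H

def encode(m, G):
    """Codeword c = m G over GF(2)."""
    return [sum(m[i] * G[i][j] for i in range(len(m))) % 2 for j in range(len(G[0]))]

def syndrome(H, c):
    """H c^T over GF(2); all zeros iff c is a codeword."""
    return [sum(h[j] * c[j] for j in range(len(c))) % 2 for h in H]

# One common standard-form generator for the [7,4] Hamming code:
G = [[1,0,0,0, 1,1,0],
     [0,1,0,0, 1,0,1],
     [0,0,1,0, 0,1,1],
     [0,0,0,1, 1,1,1]]
H = parity_check_from_generator(G)
print(syndrome(H, encode([1,0,1,1], G)))  # [0, 0, 0]
```

A nonzero syndrome flags an error: flipping any single bit of a codeword makes H c^T equal the corresponding column of H.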

How do I compute the approximate entropy of a bit string?

Is there a standard way to do this? Googling -- "approximate entropy" bits -- uncovers multiple academic papers but I'd like to just find a chunk of pseudocode defining the approximate entropy for a given bit string of arbitrary length. (In case this is easier said than done and it depends on the application, my application involves 16...
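There is a standard definition behind those papers: Pincus's approximate entropy (ApEn), which compares how often overlapping length-m blocks repeat versus length-(m+1) blocks. For binary strings, exact block matching (tolerance r = 0) is the natural choice. A sketch under that assumption (it needs len(bits) > m + 1, and includes self-matches so the logs are always defined):

```python
import math

def approx_entropy(bits, m=2):
    """Approximate entropy (Pincus ApEn) of a bit string, using exact
    matching of overlapping length-m blocks."""
    def phi(m):
        n = len(bits) - m + 1
        blocks = [bits[i:i + m] for i in range(n)]
        counts = {}
        for b in blocks:                      # frequency of each length-m block
            counts[b] = counts.get(b, 0) + 1
        # C_i = (matches of block i) / n ; phi is the average of ln(C_i)
        return sum(math.log(counts[b] / n) for b in blocks) / n
    return phi(m) - phi(m + 1)

print(approx_entropy("0000000000"))           # perfectly regular -> 0.0
print(approx_entropy("01001101011100101101"))  # irregular -> positive
```

Values near 0 mean the next bit is highly predictable from the previous m bits; larger values mean more irregularity.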

Compressibility Example

From my algorithms textbook: The annual county horse race is bringing in three thoroughbreds who have never competed against one another. Excited, you study their past 200 races and summarize these as probability distributions over four outcomes: first (“first place”), second, third, and other. Outcome Au...
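The point of that textbook exercise is that a skewed outcome distribution has low Shannon entropy, so the sequence of race results compresses well (e.g. with a Huffman code), while a uniform distribution over the four outcomes needs the full 2 bits per race. The preview cuts off before the actual probabilities, so the numbers below are hypothetical, just to show the computation:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum p*log2(p), in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distribution for one horse over (first, second, third, other);
# the textbook's real numbers are truncated in the excerpt above.
aurora = [0.65, 0.22, 0.08, 0.05]
print(entropy_bits(aurora))        # well under 2 bits -> compressible
print(entropy_bits([0.25] * 4))    # uniform: the maximum, 2 bits
```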

Information theory numerical problem, steady state probability

(If this is not the right place, please direct me to where I can get help with this. Thank you.) Calculate the steady state probability of the source emitting a "0", from "Applied coding and information theory for engineers", page 70, chapter 2, example 2.4.4: pi0 = 1/9, pi1 = pi2 = 2/9, pi3 = 4/9. For Pr(0), how are the values for Pr(0/Sn) found? By my...
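Without the book at hand the transition matrix of example 2.4.4 can't be reproduced here, but the method it asks about is standard: find the stationary distribution pi of the state transition matrix (pi P = pi), then Pr(0) = sum over n of pi_n * Pr(0 | S_n), where each Pr(0 | S_n) is read off the source's state diagram as the probability that a transition leaving state S_n emits a 0. A sketch with an entirely hypothetical 4-state chain and emission probabilities (not the textbook's):

```python
def steady_state(P, iters=1000):
    """Stationary distribution of transition matrix P (rows sum to 1),
    found by power iteration: pi <- pi P until it stops changing."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical transition matrix, for illustration only:
P = [[0.50, 0.50, 0.00, 0.00],
     [0.25, 0.25, 0.50, 0.00],
     [0.00, 0.50, 0.25, 0.25],
     [0.00, 0.00, 0.50, 0.50]]
pi = steady_state(P)

# Hypothetical Pr(0 | S_n) for each state, as would be read from a state diagram:
p0_given_state = [1.0, 0.5, 0.5, 0.0]
pr0 = sum(pi[n] * p0_given_state[n] for n in range(4))
print(pi, pr0)
```

With the book's actual matrix, the same computation should reproduce pi = (1/9, 2/9, 2/9, 4/9) and then Pr(0) follows directly from the weighted sum.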