I know about Wikipedia and MacKay's Information Theory, Inference, and Learning Algorithms (is it appropriate as a textbook?). I'm looking for a textbook that starts with Shannon's entropy and works through conditional entropy and mutual information... Any ideas? If you are following such a course at your university, which textbook does it use?
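
To make concrete what I'm after, here is a minimal Python sketch of the three quantities I'd want such a book to develop; the toy joint distribution and the helper function are my own illustration, not taken from any of the books mentioned:

    # Compute Shannon entropy, conditional entropy, and mutual information
    # for a small, made-up joint distribution p(x, y).
    import math

    def entropy(dist):
        """Shannon entropy H = -sum p log2 p, in bits (0 log 0 taken as 0)."""
        return -sum(p * math.log2(p) for p in dist if p > 0)

    # Joint distribution p(x, y): rows index x, columns index y.
    joint = [[0.25, 0.25],
             [0.40, 0.10]]

    p_x = [sum(row) for row in joint]          # marginal p(x)
    p_y = [sum(col) for col in zip(*joint)]    # marginal p(y)

    h_x = entropy(p_x)
    h_y = entropy(p_y)
    h_xy = entropy([p for row in joint for p in row])  # joint entropy H(X, Y)

    h_x_given_y = h_xy - h_y        # chain rule: H(X|Y) = H(X,Y) - H(Y)
    i_xy = h_x + h_y - h_xy         # I(X;Y) = H(X) + H(Y) - H(X,Y)

    print(f"H(X)   = {h_x:.4f} bits")
    print(f"H(X|Y) = {h_x_given_y:.4f} bits")
    print(f"I(X;Y) = {i_xy:.4f} bits")

Identities like H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y) are exactly the kind of relations I'd like the book to build up carefully.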

Thanks.

A:

I used the following textbook during my CS studies at EPFL. IMO, it's well written, with good explanations, and covers more than enough for an introduction to the field.

Elements of Information Theory, by Thomas M. Cover and Joy A. Thomas

EDIT: For further reading, here are some other titles that my professor recommended. I haven't read them (shame on me), so I can't say whether they're good.

  1. R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
  2. D. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003. (you already mentioned it)
  3. I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Akadémiai Kiadó, 1997.
  4. C. E. Shannon, The Mathematical Theory of Communication, University of Illinois Press, 1949.
Wookai
Yup, pretty much the way to go. For an introductory treatment, stick to chapter 2.
Steve
OK, thanks. If the question were about an "advanced treatment of Information Theory" instead, what would your answer be?
lmsasu
Added some other recommended readings from my professor.
Wookai
Chapter 2 of Cover/Thomas is already substantial, and a complete understanding of that chapter (or of equivalent material from another book, of course) is *necessary*. But if you must go further, I'd say chapters 4, 5, and 7; then 8 and 9; then 10.
Steve