Information Theory II

Abstract

In the winter term 2012/2013, I taught the first part of my course on information theory. I introduced basic information-theoretic quantities such as entropy, conditional entropy, mutual information, and relative entropy, presenting them from a general measure-theoretic perspective, which provides powerful tools for the treatment of information sources and information channels. Based on these quantities, I presented elementary results on the Kolmogorov-Sinai entropy of dynamical systems and proved the Shannon-McMillan-Breiman theorem. This theorem is the main result of the first part of the course and serves as a prerequisite for the second part, which I will present during the summer term 2013. There, I will concentrate on information channels and introduce the transmission rate and the capacity of a channel. The central results will be Shannon's celebrated coding theorems. To prove them, I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool in these proofs.
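For finite alphabets, the quantities named in the abstract have compact closed forms: H(X) = -Σ p(x) log₂ p(x), I(X;Y) = H(X) + H(Y) - H(X,Y), and D(p‖q) = Σ p(x) log₂(p(x)/q(x)). The following sketch illustrates them in Python; the function names and the binary-symmetric-channel example are my own choices for illustration, not material from the course.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), where joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.
    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity)."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def bsc_capacity(eps):
    """Capacity of the binary symmetric channel: C = 1 - H(eps) bits/use."""
    return 1.0 - entropy([eps, 1.0 - eps])

# A fair coin carries one bit of entropy.
print(entropy([0.5, 0.5]))                              # 1.0

# Independent bits share no information; perfectly correlated bits share one bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]])) # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))     # 1.0

# A noiseless binary channel has capacity 1; a fully noisy one, capacity 0.
print(bsc_capacity(0.0), bsc_capacity(0.5))             # 1.0 0.0
```

The mutual information is computed via the entropy identity rather than the double sum over p(x,y) log(p(x,y)/p(x)p(y)); both are equivalent, but the identity makes the relation to the joint and marginal entropies explicit.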

References

  • A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  • Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  • P. Walters. An Introduction to Ergodic Theory. Springer, 1982.

Date and time info
Wednesday, 11.30 - 13.00

Lecture period
01.04.13 - 31.07.13

Regular lectures Summer semester 2013

MPI for Mathematics in the Sciences / University of Leipzig (see the lecture detail pages)

Contact
Katharina Matschke, MPI for Mathematics in the Sciences (contact via email)