Information Theory II

  • Lecturer: Nihat Ay
  • Date: Wednesday 11.30 - 13.00; starting April 17
  • Room: MPI MiS A02
  • Language: English
  • Target audience: MSc students, PhD students, Postdocs
  • Content (Keywords): Information channels, transmission rate and capacity, Shannon's coding theorems
  • Prerequisites: Basic knowledge of probability and measure theory is required.

Abstract

In the winter term 2012/2013, I taught the first part of my course on information theory. I introduced basic information-theoretic quantities such as the entropy, the conditional entropy, the mutual information, and the relative entropy, presenting them from a general measure-theoretic perspective, which provides powerful tools for the treatment of information sources and information channels. Based on these quantities, I presented elementary results on the Kolmogorov-Sinai entropy of dynamical systems and proved the Shannon-McMillan-Breiman theorem. This theorem is the main result of the first part of the course and serves as a prerequisite for the second part, which I will present in the summer term 2013. In this second part, I will concentrate on information channels and introduce the transmission rate and the capacity of an information channel. The central results will be Shannon's celebrated coding theorems. To this end, I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool in the proofs of Shannon's coding theorems.
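For orientation, the following LaTeX block records the standard definitions of the quantities named above in the familiar discrete setting, together with the statement of the Shannon-McMillan-Breiman theorem. This is a simplified sketch in elementary notation; the course itself develops these notions measure-theoretically, so the formulas below are illustrative rather than the lecture's own formulation.

% Standard discrete-case definitions (a simplified sketch; the course
% works in a general measure-theoretic setting).
\begin{align*}
  H(X)          &= -\sum_{x} p(x)\,\log p(x)                   && \text{(entropy)}\\
  H(X \mid Y)   &= -\sum_{x,y} p(x,y)\,\log p(x \mid y)        && \text{(conditional entropy)}\\
  I(X;Y)        &= H(X) - H(X \mid Y)                          && \text{(mutual information)}\\
  D(p \,\|\, q) &= \sum_{x} p(x)\,\log \frac{p(x)}{q(x)}       && \text{(relative entropy)}\\
  C             &= \sup_{p} \, I(X;Y)                          && \text{(capacity of a memoryless channel)}
\end{align*}

% Shannon-McMillan-Breiman theorem: for a stationary ergodic source
% $\mu$ with entropy rate $h(\mu)$, the per-symbol information of the
% observed block converges almost surely to the entropy rate.
\begin{equation*}
  -\frac{1}{n}\,\log \mu\bigl([x_0, \dots, x_{n-1}]\bigr)
  \;\longrightarrow\; h(\mu)
  \qquad \text{$\mu$-almost surely.}
\end{equation*}

Roughly speaking, Feinstein's lemma complements this: for any rate below the channel capacity it guarantees, for sufficiently long block lengths, the existence of codebooks of that rate with uniformly small error probability, which is the combinatorial core of the coding theorems.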

References

  1. A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  2. Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  3. P. Walters. An Introduction to Ergodic Theory. Springer, New York, 1982.