Information Theory

  • Lecturer: Nihat Ay
  • Date: Tuesday 11:00 - 12:30
  • Room: MPI MiS, A3 02
  • Language: English
  • Target audience: MSc students, PhD students, Postdocs
  • First lecture: November 2016
  • Keywords: Entropy, mutual information, relative entropy, Kolmogorov-Sinai entropy, information channels, Shannon's coding theorems
  • Prerequisites: Basic knowledge of probability theory and measure theory is required

Abstract

This course is divided into two parts. In the first part, I will introduce the basic information-theoretic quantities: entropy, conditional entropy, mutual information, and relative entropy. I will highlight a measure-theoretic perspective, which provides powerful tools for the treatment of information sources and information channels. Building on these quantities, I will present elementary results on the Kolmogorov-Sinai entropy of dynamical systems and prove the Shannon-McMillan-Breiman theorem. This theorem serves as a prerequisite for the second part of my course, in which I will concentrate on information channels. I will introduce the transmission rate and the capacity of information channels. The central results of this part are Shannon's celebrated coding theorems. I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool for their proofs.
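For orientation, here is a brief sketch of the central quantities in the discrete setting; the lecture develops their measure-theoretic generalizations.

  % Discrete random variables X, Y with joint distribution p(x,y)
  % and marginals p(x), p(y).
  \begin{align*}
    H(X)          &= -\sum_{x} p(x)\,\log p(x)             && \text{entropy}\\
    H(X \mid Y)   &= -\sum_{x,y} p(x,y)\,\log p(x \mid y)  && \text{conditional entropy}\\
    I(X;Y)        &= H(X) - H(X \mid Y)                    && \text{mutual information}\\
    D(p \,\|\, q) &= \sum_{x} p(x)\,\log \frac{p(x)}{q(x)} && \text{relative entropy}
  \end{align*}
  % Shannon-McMillan-Breiman theorem (stationary ergodic source):
  %   -\frac{1}{n}\,\log p(X_1,\dots,X_n) \to h  \quad \text{almost surely},
  % where h is the entropy rate (Kolmogorov-Sinai entropy) of the source.
  % Capacity of a (memoryless) channel:  C = \sup_{p(x)} I(X;Y),
  % the supremum of the transmission rate over all input distributions.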

References

  1. A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  2. Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  3. P. Walters. An Introduction to Ergodic Theory. Springer, New York, 1982.
  4. T. M. Cover and J. A. Thomas. Elements of Information Theory. 2nd edition. Wiley, Hoboken, 2006.


Regular Lectures (Winter 2016/2017)
