Information Theory II

  • Lecturer: Nihat Ay
  • Date: Tuesday, 11:00 - 12:30
  • Room: MPI MiS A3 02
  • Language: English
  • Target audience: MSc students, PhD students, Postdocs
  • Keywords: Information theory, information channels, transmission rate, channel capacity, Shannon's coding theorems
  • Prerequisites: Basic knowledge of probability and measure theory is required.
  • Remarks: This course consists of six lectures, which will take place on April 4 and 25 and on May 2, 9, 16, and 23. In the first lecture on April 4, I will provide a brief summary of the results from Information Theory I that will be required for this course.

Abstract:

This is a continuation of my Information Theory I course, which I offered in the winter term 2016/2017. That course concluded with elementary results on the Kolmogorov-Sinai entropy of dynamical systems and a proof of the Shannon-McMillan-Breiman theorem for information sources. This theorem is a prerequisite for the second part of the course, Information Theory II, in which I will concentrate on information channels. I will introduce the transmission rate and the capacity of information channels. The central results of this part will be Shannon's celebrated coding theorems. I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool in the proofs of Shannon's coding theorems.
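For orientation, the two results around which the course is built can be stated as follows in the simplest setting of a finite alphabet; this is only a sketch of the statements, in the notation of Cover and Thomas (reference 4 below), and the lectures will develop them in detail.

Shannon-McMillan-Breiman theorem: if $(X_n)_{n \geq 1}$ is a stationary ergodic source with finite alphabet and entropy rate $h$, then
\[
-\frac{1}{n} \log p(X_1, \dots, X_n) \longrightarrow h \quad \text{almost surely as } n \to \infty .
\]
Capacity: for a discrete memoryless channel with transition probabilities $p(y \mid x)$, the capacity is the maximal mutual information over all input distributions,
\[
C = \max_{p(x)} I(X ; Y) ,
\]
and Shannon's channel coding theorem states that every transmission rate $R < C$ is achievable with arbitrarily small error probability, while no rate $R > C$ is.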

References:

  1. A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  2. Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  3. P. Walters. An Introduction to Ergodic Theory. Springer, 1982.
  4. T. M. Cover, J. A. Thomas. Elements of Information Theory. Wiley, 2006.
