Information Theory II

Abstract

This course is a continuation of my Information Theory I course from the winter term 2016/2017, which concluded with elementary results on the Kolmogorov-Sinai entropy of dynamical systems and a proof of the Shannon-McMillan-Breiman theorem for information sources. That theorem is a prerequisite for this second part, in which I will concentrate on information channels: I will introduce the transmission rate and the capacity of an information channel. The central results of the course are Shannon's celebrated coding theorems. To prove them, I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, forms the main tool in the proofs of the coding theorems.
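
For orientation, here are the central statements in their simplest form, for sources with a finite alphabet and a discrete memoryless channel, as in [4]; the lectures will presumably work in the more general measure-theoretic setting of [1] and [2]. For a stationary ergodic source $(X_n)_{n\ge 1}$ with entropy rate $H$, the Shannon-McMillan-Breiman theorem states

\[
  -\frac{1}{n}\,\log p(X_1,\dots,X_n) \;\longrightarrow\; H
  \qquad \text{almost surely and in } L^1 \text{ as } n \to \infty .
\]

For a discrete memoryless channel with transition probabilities $p(y \mid x)$, the capacity is the maximal mutual information between input $X$ and output $Y$,

\[
  C \;=\; \max_{p(x)} I(X;Y),
  \qquad
  I(X;Y) \;=\; H(Y) - H(Y \mid X),
\]

and Shannon's coding theorems assert that every transmission rate $R < C$ is achievable with error probability tending to zero, while no rate $R > C$ is.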

References

  1. A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  2. Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  3. P. Walters. An Introduction to Ergodic Theory. Springer, New York, 1982.
  4. T. M. Cover, J. A. Thomas. Elements of Information Theory. 2nd edition. Wiley, Hoboken, 2006.

Date and time info
Tuesday, 11:00 - 12:30

Keywords
Information theory, information channels, transmission rate, channel capacity, Shannon's coding theorems

Prerequisites
Basic knowledge of probability theory and measure theory is required.

Audience
MSc students, PhD students, Postdocs

Language
English

Remarks and notes
This course consists of six lectures, which will take place on April 4 and 25 and on May 2, 9, 16, and 23. In the first lecture on April 4, I will give a brief summary of the basic results of Information Theory I that will be required for this course.

Lecture
01.04.17 - 31.07.17

Regular lectures, Summer semester 2017

MPI for Mathematics in the Sciences / University of Leipzig

Katharina Matschke

MPI for Mathematics in the Sciences