Information Theory

Abstract

This course is divided into two parts. In the first part, I will introduce basic information-theoretic quantities such as entropy, conditional entropy, mutual information, and relative entropy. I will highlight a measure-theoretic perspective, which provides powerful tools for the treatment of information sources and information channels. Based on these quantities, I will present elementary results on the Kolmogorov-Sinai entropy of dynamical systems and prove the Shannon-McMillan-Breiman theorem. This theorem serves as a prerequisite for the second part of the course, in which I will concentrate on information channels. I will introduce the transmission rate and the capacity of information channels. The central results of this part are Shannon's celebrated coding theorems. I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool for their proofs.
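For orientation, here is a minimal sketch of these quantities in their familiar discrete form, following the usual textbook conventions (cf. Cover and Thomas [4]); the course itself develops them from the measure-theoretic perspective described above, and the capacity is given below only in its memoryless special case.

```latex
% Minimal sketch: standard discrete-case definitions (not the course's
% general measure-theoretic versions). Compiles standalone with amsmath.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $X, Y$ be discrete random variables with joint distribution $p(x,y)$,
and let $q$ be a second distribution on the alphabet of $X$. Then
\begin{align*}
  H(X)          &= -\sum_{x} p(x) \log p(x)             && \text{(entropy)} \\
  H(X \mid Y)   &= -\sum_{x,y} p(x,y) \log p(x \mid y)  && \text{(conditional entropy)} \\
  I(X;Y)        &= H(X) - H(X \mid Y)                   && \text{(mutual information)} \\
  D(p \,\|\, q) &= \sum_{x} p(x) \log \frac{p(x)}{q(x)} && \text{(relative entropy)} \\
  C             &= \sup_{p}\, I(X;Y)                    && \text{(capacity, memoryless case)}
\end{align*}
where the supremum for $C$ runs over the input distributions $p$.
\end{document}
```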

References

  1. A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
  2. Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
  3. P. Walters. An Introduction to Ergodic Theory. Springer, New York, 1982.
  4. T. M. Cover, J. A. Thomas. Elements of Information Theory. Wiley, Hoboken, 2006.

Date and time info
Tuesday 11:00 - 12:30

Keywords
Information theory, entropy, ergodic theory, information channels, Shannon's coding theorems

Prerequisites
Basic knowledge of probability theory and measure theory is required.

Audience
MSc students, PhD students, Postdocs

Language
English

Lecture
01.10.16 - 31.01.17

Regular lectures, winter semester 2016-2017

MPI for Mathematics in the Sciences / University of Leipzig (see the lecture detail pages)

Contact
Katharina Matschke
MPI for Mathematics in the Sciences