Information Theory
- Nihat Ay
Abstract
This course is divided into two parts. In the first part, I will introduce basic information-theoretic quantities such as entropy, conditional entropy, mutual information, and relative entropy. I will highlight a measure-theoretic perspective, which provides powerful tools for the treatment of information sources and information channels. Building on these quantities, I will present elementary results on the Kolmogorov-Sinai entropy of dynamical systems and prove the Shannon-McMillan-Breiman theorem. This theorem serves as a prerequisite for the second part of the course, in which I will concentrate on information channels. I will introduce the transmission rate and the capacity of information channels. The central theorems of this part are Shannon's celebrated coding theorems. I will develop Feinstein's fundamental lemma, which, together with the Shannon-McMillan-Breiman theorem, constitutes the main tool for proving Shannon's coding theorems.
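For orientation, a minimal sketch of the standard definitions behind the quantities named above, stated for discrete random variables X and Y with joint distribution p(x, y) and for a measure-preserving transformation T with a finite partition P (following, e.g., Cover and Thomas; the course itself develops these in a measure-theoretic setting):

```latex
\begin{align*}
  H(X)          &= -\sum_{x} p(x)\,\log p(x)
                && \text{entropy}\\
  H(X \mid Y)   &= -\sum_{x,y} p(x,y)\,\log p(x \mid y)
                && \text{conditional entropy}\\
  I(X;Y)        &= \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}
                && \text{mutual information}\\
  D(p \,\|\, q) &= \sum_{x} p(x)\,\log \frac{p(x)}{q(x)}
                && \text{relative entropy}\\
  h_\mu(T)      &= \sup_{\mathcal{P}} \,\lim_{n\to\infty} \frac{1}{n}\,
                   H\Big(\bigvee_{i=0}^{n-1} T^{-i}\mathcal{P}\Big)
                && \text{Kolmogorov-Sinai entropy}\\
  C             &= \sup_{p(x)} I(X;Y)
                && \text{channel capacity}
\end{align*}
```

Here the supremum defining C runs over the input distributions p(x) of the channel; Shannon's coding theorems identify C as the maximal transmission rate achievable with arbitrarily small error probability.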
References
- A. I. Khintchine. Mathematical Foundations of Information Theory. Dover, New York, 1958.
- Y. Kakihara. Abstract Methods in Information Theory. World Scientific, Singapore, 1999.
- P. Walters. An Introduction to Ergodic Theory. Springer, 1982.
- T. M. Cover and J. A. Thomas. Elements of Information Theory. 2nd edition. Wiley, 2006.
Date and time
Tuesday 11:00 - 12:30
Keywords
Information theory, entropy, mutual information, Kolmogorov-Sinai entropy, information channels, Shannon coding theorems
Prerequisites
Basic knowledge of probability theory and measure theory is required
Audience
MSc students, PhD students, Postdocs
Language
English