Talk

Mutual information in algorithmic information theory

  • Andrei Romashchenko (LIRMM, France)
A3 02 (Seminar room)

Abstract

The description complexity (a.k.a. Kolmogorov complexity) of a string x, denoted C(x), is defined as the length of the shortest program that prints x. Given a pair of strings x and y, we define the mutual information between them as I(x:y) = C(x) + C(y) - C(x,y). Intuitively, the mutual information measures the correlation between two strings: the stronger the correlation, the larger the mutual information. This notion has a clean formal definition, but it lacks a more explicit interpretation or a "physical" meaning: in general, there is no string z of complexity I(x:y) that could be interpreted as a common part of x and y.
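
For orientation, here are two standard sanity checks (textbook facts about Kolmogorov complexity, not results of this talk). A string shares all of its information with itself, while two independent random strings share essentially none:

\[ I(x:x) = 2C(x) - C(x,x) = C(x) + O(1), \]
\[ I(x:y) = O(\log n) \quad \text{for most pairs of $n$-bit strings $x,y$,} \]

the first because a program that prints x also prints the pair (x,x) with constant overhead, the second because C(x,y) is close to C(x) + C(y) ≈ 2n for most such pairs.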

However, it turns out that the mutual information can be interpreted in terms of a communication protocol. Consider two parties, one holding x and the other holding y, who interact over a public channel. The aim of the protocol is to establish the longest possible shared secret key without revealing any information about this key to an eavesdropper who observes the communication. It turns out that for every pair of inputs x, y the optimal size of the key is equal to the mutual information between x and y.
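
Quantitatively (this formulation is a paraphrase with the logarithmic slack that is standard in this area, not a quote from the abstract), the result reads: for all strings x, y of length at most n, the longest key on which the two parties can agree has length

\[ I(x:y) \pm O(\log n), \]

and no protocol achieves a longer key against an eavesdropper who sees the entire transcript.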

In the talk we will discuss the context around this question (the invariance theorem for Kolmogorov complexity, the chain rule) and explain how the question about mutual information can be translated into the language of combinatorial properties of graphs. Then, as time allows, we will sketch the technical ideas behind the proof (randomness extractors, information inequalities, spectral bounds for graphs with good mixing properties).
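
For reference, the chain rule mentioned above is the Kolmogorov–Levin theorem, stated here with its usual logarithmic precision:

\[ C(x,y) = C(x) + C(y \mid x) + O(\log C(x,y)). \]

Substituting it into the definition of I(x:y) shows that the mutual information coincides, up to O(log)-terms, with C(y) - C(y|x) and with C(x) - C(x|y); in particular, it is symmetric in x and y.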

The talk is based on joint works with Marius Zimand and Emirhan Gürpinar.

Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via mail)