Preprint 10/2015

The homological nature of entropy

Pierre Baudot and Daniel Bennequin

Submission date: 12 February 2015 (revised version: May 2015)
Pages: 66
Published in: Entropy 17 (2015), no. 5, pp. 3253-3318
DOI number (of the published article): 10.3390/e17053253
MSC-Numbers: 20J06, 94A17, 81P45, 18D50, 62F15, 14F35
Keywords and phrases: Shannon information, Homology Theory, entropy, Quantum Information, Homotopy of Links, Mutual Informations, Kullback-Leibler divergence, Trees
Download full preprint: PDF (657 kB)

Abstract:
We propose that entropy is a universal cohomological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: 1) classical probabilities and random variables; 2) quantum probabilities and observable operators; 3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, which accounts for the main information functions (entropy, mutual information at all orders, and the Kullback-Leibler divergence) and generalises them in several ways. The article is divided into two parts, which can be read independently. The first part, the introduction, provides an overview of the results, some open questions, results to be developed in future work, and lines of research, and briefly discusses the application to complex data. The second part gives the complete definitions and proofs of Theorems 1, 3 and 5 stated in the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, and the dynamics of classical or quantum strategies of observation of a finite system.
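For orientation, the standard quantities named in the abstract can be written as follows. This is a minimal sketch in the usual Shannon conventions; the chain-rule identity at the end is included only to indicate the flavour of the homological characterisation, and the notation is ours, not necessarily that of the paper.

% Shannon entropy of a discrete random variable X with law p
\[ H(X) \;=\; -\sum_{x} p(x)\,\log p(x) \]
% Kullback-Leibler divergence between laws P and Q on the same finite set
\[ D(P \,\|\, Q) \;=\; \sum_{x} p(x)\,\log \frac{p(x)}{q(x)} \]
% mutual information at order n (alternating sum of joint entropies)
\[ I_n(X_1;\dots;X_n) \;=\; \sum_{k=1}^{n} (-1)^{k-1} \sum_{i_1<\dots<i_k} H(X_{i_1},\dots,X_{i_k}) \]
% chain rule, the identity that plays the role of a cocycle condition in degree one
\[ H(X,Y) \;=\; H(X) \;+\; \sum_{x} p(x)\, H(Y \mid X = x) \]

For n = 2 the alternating sum reduces to the familiar I(X;Y) = H(X) + H(Y) - H(X,Y).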
