Preprint 58/2003

Multi-information in the thermodynamic limit

Ionas Erb and Nihat Ay

Submission date: 08. Jul. 2003
Pages: 37
published in: Journal of Statistical Physics, 115 (2004) 3-4, pp. 949-976
DOI number (of the published article): 10.1023/B:JOSS.0000022375.49904.ea
MSC-Numbers: 60K35, 79XX, 60XX
Keywords and phrases: mutual information, Ising model, phase transitions, complexity
Download full preprint: PDF (719 kB), PS zipped (274 kB)

Abstract:
In information theory, mutual information is known to measure the stochastic interdependence of the two subsystems of a probability distribution. We use a generalised version of this measure, the multi-information, which is the Kullback-Leibler distance of a distribution from the corresponding independent (product) distribution, and give a definition within the framework of statistical mechanics. There, the theory of infinite-volume Gibbs measures allows for the description of phase coexistence: the interaction potential of a model can admit several Gibbs measures at the same time. We propose to define a quantity that depends directly on the interaction potential by taking the least multi-information over all translation-invariant Gibbs measures. We show that it suffices to take this infimum over the pure, i.e. physically relevant, states only. We apply our definition to the two-dimensional Ising model and derive the main result: on the Ising square lattice, the multi-information as a function of temperature attains its isolated global maximum at the point of phase transition, where its one-sided derivatives diverge. Finally, we briefly discuss the behaviour of the one-dimensional Ising chain in a magnetic field.
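
As a rough guide to the central definition (the notation below is illustrative and is not quoted from the paper): for a probability distribution p on a finite product space with marginals p_i, the multi-information is the Kullback-Leibler distance to the product of the marginals,

\[
  I(p) \;=\; D\!\Bigl(p \,\Big\|\, \textstyle\bigotimes_{i=1}^{n} p_i\Bigr)
        \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{\prod_{i=1}^{n} p_i(x_i)},
\]

which vanishes exactly when p is itself the product of its marginals. The quantity studied in the paper is then, loosely speaking, the infimum of the infinite-volume analogue of this measure over all translation-invariant Gibbs measures of a given interaction potential.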
