Exponential families are natural statistical models. They arise in physics because their elements maximize entropy subject to constraints on the expectation values of a fixed set of associated observables. An important subclass consists of the graphical and hierarchical (log-linear) models, which describe interactions between random variables. Exponential families also appear in information geometry and algebraic statistics due to their rich structural properties.
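As a concrete instance of the maximum-entropy property, a one-parameter exponential family p_theta(x) proportional to exp(theta f(x)) on a finite state space contains, for each attainable value, the entropy maximizer among all distributions with that expectation of f. A minimal numerical sketch (the observable, target value, and comparison distribution are illustrative choices, not from the text):

```python
import numpy as np

def gibbs(theta, f):
    """Element p_theta of the exponential family p_theta(x) ~ exp(theta * f(x))."""
    w = np.exp(theta * f)
    return w / w.sum()

def entropy(p):
    return -np.sum(p * np.log(p))

f = np.arange(4.0)   # observable f(x) = x on the states {0, 1, 2, 3}
target = 1.2         # constrained expectation value E[f] (illustrative choice)

# Solve E_theta[f] = target by bisection; this expectation is increasing in theta.
lo, hi = -20.0, 20.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if gibbs(mid, f) @ f < target:
        lo = mid
    else:
        hi = mid
p_star = gibbs(0.5 * (lo + hi), f)

# Any other distribution with the same expectation has strictly lower entropy.
q = np.array([0.4, 0.2, 0.2, 0.2])   # hypothetical competitor, also with mean 1.2
print(entropy(p_star), entropy(q))
```

Here the bisection recovers the natural parameter theta matching the constraint, and the entropy comparison illustrates that the exponential-family member dominates any other distribution satisfying the same expectation constraint.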
The information distance from an exponential family can be interpreted as the information lost by projecting onto that family; mutual information, conditional mutual information, and multi-information all admit such a geometric interpretation. In this project we analyze the maximization of the distance from exponential families, a problem motivated by principles of information maximization known from theoretical neuroscience. The project aims at identifying natural models of learning systems that are consistent with information maximization and, at the same time, display high generalization ability. In this context, the topological closures of exponential families turn out to be essential; geometrically, they correspond to polytopes and display a rich combinatorial structure.
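The geometric interpretation mentioned above can be checked numerically in the simplest case: the mutual information of two variables equals the Kullback-Leibler divergence from their joint distribution to its projection onto the independence model, i.e. the product of the marginals. A small sketch of this identity, using an arbitrary example joint distribution (not taken from the text):

```python
import numpy as np

# Joint distribution of two binary variables (X, Y) -- an arbitrary example.
p = np.array([[0.3, 0.1],
              [0.2, 0.4]])

px = p.sum(axis=1)      # marginal distribution of X
py = p.sum(axis=0)      # marginal distribution of Y
q = np.outer(px, py)    # projection of p onto the independence model

def entropy(r):
    return -np.sum(r * np.log(r))

# Mutual information computed via entropies ...
mi = entropy(px) + entropy(py) - entropy(p)
# ... equals the KL divergence (information distance) from p to its projection q.
kl = np.sum(p * np.log(p / q))
print(mi, kl)
```

Maximizing this divergence over joint distributions p, with the independence model replaced by a general exponential family, is exactly the optimization problem the project studies.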