

Preprint 82/2009
Finding the Maximizers of the Information Divergence from an Exponential Family
Johannes Rauh
Submission date: 23 December 2009
Pages: 27
Published in: IEEE Transactions on Information Theory, 57 (2011) 6, pp. 3236-3247
DOI number (of the published article): 10.1109/TIT.2011.2136230
MSC-Numbers: 94A17, 62B10, 52C40, 13P25
Keywords and phrases: Kullback-Leibler divergence, relative entropy, exponential family, information projection, optimization, commutative algebra
Download full preprint: PDF (280 kB)
Abstract:
This paper investigates maximizers of the information divergence $D(\cdot\,\|\,\mathcal{E})$ from an exponential family $\mathcal{E}$. It is shown that the rI-projection of a maximizer $P$ to $\mathcal{E}$ is a convex combination of $P$ and a probability measure $P_-$ with disjoint support and the same value of the sufficient statistics $A$. This observation can be used to transform the original problem of maximizing $D(\cdot\,\|\,\mathcal{E})$ over the set of all probability measures into the maximization of a function $\overline{D}$ over a convex subset of $\ker A$. The global maximizers of both problems correspond to each other. Furthermore, finding all local maximizers of $\overline{D}$ yields all local maximizers of $D(\cdot\,\|\,\mathcal{E})$.
This paper also proposes two algorithms to find the maximizers of $\overline{D}$ and applies them to two examples, where the maximizers of $D(\cdot\,\|\,\mathcal{E})$ were not known before.
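As a companion to the abstract, the following is a minimal numerical sketch. It is not one of the paper's two algorithms (which operate on the reformulated problem over $\ker A$); it simply evaluates $D(P\,\|\,\mathcal{E})$ by fitting the natural parameters of the family (the rI-projection) and searches for a maximizer by random restarts on a toy independence model. The function names, the statistics matrix `A`, and the Dirichlet sampling are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def rI_projection(P, A, theta0=None):
    """Minimize D(P || Q_theta) over the exponential family
    Q_theta(x) proportional to exp(theta . A[:, x]); returns Q."""
    k, n = A.shape
    theta0 = np.zeros(k) if theta0 is None else theta0

    def neg_loglik(theta):
        logits = theta @ A
        logZ = np.logaddexp.reduce(logits)
        # -sum_x P(x) log Q_theta(x) = logZ - P . logits; minimizing this
        # over theta is equivalent to minimizing D(P || Q_theta).
        return logZ - P @ logits

    res = minimize(neg_loglik, theta0, method="BFGS")
    logits = res.x @ A
    return np.exp(logits - np.logaddexp.reduce(logits))

def divergence_from_family(P, A, eps=1e-12):
    """D(P || E) evaluated at the rI-projection of P."""
    Q = rI_projection(P, A)
    mask = P > eps
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

# Toy example: product distributions of two binary variables
# (independence model); rows of A are indicator statistics.
A = np.array([
    [1, 1, 1, 1],   # normalization
    [0, 0, 1, 1],   # first bit
    [0, 1, 0, 1],   # second bit
])

rng = np.random.default_rng(0)
best_div, best_P = max(
    ((divergence_from_family(p, A), p)
     for p in rng.dirichlet(np.ones(4), size=200)),
    key=lambda t: t[0],
)
print("best divergence found:", best_div)
```

For this independence model the global maximum of $D(\cdot\,\|\,\mathcal{E})$ is $\log 2 \approx 0.693$, attained at the uniform distribution on $\{00, 11\}$; the random search above only approximates it, which illustrates why structural results like those in the preprint are needed to locate maximizers exactly.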