Preprint 82/2009

Finding the Maximizers of the Information Divergence from an Exponential Family

Johannes Rauh

Submission date: 23 December 2009
Pages: 27
Published in: IEEE Transactions on Information Theory, 57 (2011) 6, pp. 3236-3247
DOI number (of the published article): 10.1109/TIT.2011.2136230
MSC-Numbers: 94A17, 62B10, 52C40, 13P25
Keywords and phrases: Kullback-Leibler divergence, relative entropy, exponential family, information projection, optimization, commutative algebra

This paper investigates maximizers of the information divergence from an exponential family ℰ. It is shown that the rI-projection of a maximizer P to ℰ is a convex combination of P and a probability measure P₋ with disjoint support and the same value of the sufficient statistics A. This observation can be used to transform the original problem of maximizing D(·‖ℰ) over the set of all probability measures into the maximization of a function D̄ over a convex subset of ker A. The global maximizers of both problems correspond to each other. Furthermore, finding all local maximizers of D̄ yields all local maximizers of D(·‖ℰ).
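As a concrete illustration (a standard example, not taken from the paper): for the independence model ℰ of two binary variables, the divergence D(P‖ℰ) equals the mutual information I(X;Y), and the rI-projection of P is the product of its marginals. The sketch below numerically checks the known maximizer and the disjoint-support convex-combination structure described above; all function names are ad hoc.

```python
import numpy as np

def rI_projection(P):
    """rI-projection of a 2x2 joint distribution onto the independence
    model: the product of its marginal distributions."""
    return np.outer(P.sum(axis=1), P.sum(axis=0))

def divergence_from_independence(P):
    """D(P || E) for the independence model E = mutual information I(X;Y)."""
    Q = rI_projection(P)
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

# Known maximizer: the uniform distribution on the diagonal {(0,0), (1,1)}.
P = np.array([[0.5, 0.0], [0.0, 0.5]])
print(divergence_from_independence(P))  # log 2 ~ 0.6931

# The rI-projection of P is a convex combination of P and a measure P_minus
# with disjoint support and the same marginals (sufficient statistics).
P_minus = np.array([[0.0, 0.5], [0.5, 0.0]])
print(np.allclose(rI_projection(P), 0.5 * P + 0.5 * P_minus))  # True
```

Here the projection is the uniform distribution on all four points, which splits as an equal mixture of the diagonal measure P and the antidiagonal measure P_minus.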

This paper also proposes two algorithms to find the maximizers of D̄ and applies them to two examples where the maximizers of D(·‖ℰ) were not known before.
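For intuition only, here is a naive baseline that is not one of the paper's algorithms: a direct random search over the set of all probability measures, applied to the independence model of two binary variables, where the divergence equals the mutual information and the maximal value is known to be log 2.

```python
import numpy as np

def mutual_information(P):
    """D(P || E) for the 2x2 independence model E = mutual information I(X;Y)."""
    Q = np.outer(P.sum(axis=1), P.sum(axis=0))
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

rng = np.random.default_rng(0)
best = 0.0
for _ in range(20000):
    # A Dirichlet with small concentration favors sparse distributions,
    # which is where the maximizers of the divergence are found.
    P = rng.dirichlet(np.full(4, 0.1)).reshape(2, 2)
    best = max(best, mutual_information(P))

print(best)  # close to, and bounded above by, log 2 ~ 0.6931
```

This brute-force search scales poorly with the state space; the point of the paper's reduction is that the search can instead be carried out over a convex subset of ker A, a space of much smaller dimension.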
