MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we encourage all institute members to make their preprints available there. The institute's repository in its previous form is therefore unnecessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint
82/2009

Finding the Maximizers of the Information Divergence from an Exponential Family

Johannes Rauh

Abstract

This paper investigates maximizers of the information divergence from an exponential family $\mathcal{E}$. It is shown that the $rI$-projection of a maximizer $P$ to $\mathcal{E}$ is a convex combination of $P$ and a probability measure $P_{-}$ whose support is disjoint from that of $P$ and which has the same value of the sufficient statistics $A$. This observation can be used to transform the original problem of maximizing $D(\cdot||\mathcal{E})$ over the set of all probability measures into the maximization of a function $\overline{D}_r$ over a convex subset of $\ker A$. The global maximizers of both problems correspond to each other. Furthermore, finding all local maximizers of $\overline{D}_r$ yields all local maximizers of $D(\cdot||\mathcal{E})$.
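In display form (the projection map $\pi_{\mathcal{E}}$ and the mixture weight $\lambda$ are notation introduced here for readability, not taken from the paper), the structural result stated above reads

$$\pi_{\mathcal{E}}(P) \;=\; \lambda\,P + (1-\lambda)\,P_{-}, \qquad \lambda \in [0,1], \qquad A P_{-} = A P, \qquad \operatorname{supp}(P_{-}) \cap \operatorname{supp}(P) = \emptyset.$$

Since $A P_{-} = A P$, the difference $P - P_{-}$ lies in $\ker A$, which indicates why the reduced maximization problem can be posed on a convex subset of $\ker A$.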

This paper also proposes two algorithms for finding the maximizers of $\overline{D}_r$ and applies them to two examples in which the maximizers of $D(\cdot||\mathcal{E})$ were not previously known.
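The two algorithms are developed in the paper itself and are not reproduced here. As a rough numerical baseline only, the sketch below illustrates the objects involved: it computes $D(P||\mathcal{E})$ for a toy exponential family by minimizing $D(P||Q_\theta)$ over the natural parameters $\theta$ (the minimizer being the $rI$-projection of $P$), and then runs a naive multi-start search over the probability simplex. The independence model used as $\mathcal{E}$, the matrix `A`, and all function names are illustrative assumptions, not the paper's setup.

```python
# Minimal numerical sketch (NOT the paper's algorithms): compute D(P||E) for a
# toy exponential family and search for a maximizer by multi-start local search.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy setting: two binary variables, states ordered 00, 01, 10, 11, and
# E = the independence model with sufficient statistics A(x) = (x_1, x_2).
A = np.array([[0, 0, 1, 1],   # first bit
              [0, 1, 0, 1]])  # second bit
n_states = A.shape[1]

def divergence_from_family(P):
    """D(P||E) = min_theta D(P||Q_theta), with Q_theta(x) proportional to exp(theta . A(x))."""
    def kl_to_theta(theta):
        log_q = theta @ A
        log_q -= np.log(np.exp(log_q).sum())              # normalize: log Q_theta
        return np.sum(P * (np.log(np.maximum(P, 1e-300)) - log_q))
    return minimize(kl_to_theta, np.zeros(A.shape[0]), method="BFGS").fun

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Naive multi-start maximization of D(.||E) over the probability simplex.
best_val, best_P = -np.inf, None
for _ in range(20):
    res = minimize(lambda z: -divergence_from_family(softmax(z)),
                   rng.normal(size=n_states), method="Nelder-Mead")
    if -res.fun > best_val:
        best_val, best_P = -res.fun, softmax(res.x)

print("approximate maximizer P:", np.round(best_P, 3))
print("approximate max D(P||E):", round(best_val, 4))
```

For this toy family the divergence from the independence model equals the mutual information of the two bits, so the search should approach the perfectly correlated distribution $\tfrac{1}{2}(\delta_{00}+\delta_{11})$ with value $\log 2 \approx 0.693$. Its $rI$-projection is the uniform distribution, and $P_{-} = \tfrac{1}{2}(\delta_{01}+\delta_{10})$ has disjoint support and the same marginals, matching the structural result above.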

Received:
Dec 23, 2009
Published:
Jan 6, 2010
MSC Codes:
94A17, 62B10, 52C40, 13P25
Keywords:
Kullback-Leibler divergence, relative entropy, exponential family, information projection, optimization, commutative algebra

Related publications

Journal article
2011, Repository Open Access
Johannes Rauh

Finding the maximizers of the information divergence from an exponential family

In: IEEE Transactions on Information Theory, 57 (2011) 6, pp. 3236-3247