We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we encourage all institute members to make their preprints available there. The institute's repository in its previous form is therefore no longer needed. The preprints published to date will remain available here, but no new preprints will be added.
MiS Preprint
46/2003
On maximization of the information divergence from an exponential family
František Matúš and Nihat Ay
Abstract
The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ over $Q \in \mathcal{E}$. For convex exponential families the local maximizers of this function of $P$ are found. A general exponential family $\mathcal{E}$ of dimension $d$ is enlarged to an exponential family $\mathcal{E}^*$ of dimension at most $3d+2$ such that the local maximizers have zero divergence from $\mathcal{E}^*$.
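To make the quantity in the abstract concrete, here is a small numerical sketch (not part of the paper) for the simplest nontrivial case: the independence model on a $2\times 2$ sample space. For this exponential family the minimizing $Q$ is the product of the marginals of $P$, so the divergence $D(P\,\|\,\mathcal{E}) = \inf_{Q\in\mathcal{E}} D(P\,\|\,Q)$ equals the mutual information of $P$. The distribution `P` below is an arbitrary illustrative choice.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) on a finite set (natural log)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# An arbitrary joint distribution P on {0,1} x {0,1}
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# For the independence model, the minimizer of D(P || Q) over the family
# is the product of the marginals of P (moment matching).
Q = np.outer(P.sum(axis=1), P.sum(axis=0))

# D(P || E): here it coincides with the mutual information of P
divergence = kl(P.ravel(), Q.ravel())
print(divergence)
```

For this symmetric `P` both marginals are uniform, so `Q` is the uniform distribution on four points and the divergence is about 0.193 nats; it vanishes exactly when `P` is itself a product measure, i.e. lies in the family.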