MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we are encouraging all institute members to make their preprints available there. The institute's repository in its previous form is therefore unnecessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint
46/2003

On maximization of the information divergence from an exponential family

František Matúš and Nihat Ay

Abstract

The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q \in \mathcal{E}$. For convex exponential families, the local maximizers of this function of $P$ are found. A general exponential family $\mathcal{E}$ of dimension $d$ is enlarged to an exponential family $\mathcal{E}^*$ of dimension at most $3d+2$ such that the local maximizers are of zero divergence from $\mathcal{E}^*$.
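The quantity studied in the abstract is the divergence from the family, $D(P \| \mathcal{E}) = \inf_{Q \in \mathcal{E}} D(P \| Q)$. The following is a minimal numerical sketch of that inner infimum for a one-dimensional exponential family on a four-element set; the sufficient statistic, the measure $P$, and all function names are illustrative choices for this sketch, not notation or methods from the preprint.

```python
# Minimal sketch: approximate D(P || E) = inf_{Q in E} D(P || Q) for a
# one-dimensional exponential family E on a finite set. Illustrative only.
import numpy as np
from scipy.optimize import minimize_scalar

# Finite ground set {0, 1, 2, 3} with sufficient statistic f(x) = x (an assumption).
f = np.array([0.0, 1.0, 2.0, 3.0])

def family_member(theta):
    """Q_theta(x) proportional to exp(theta * f(x)), a member of the family E."""
    w = np.exp(theta * f)
    return w / w.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q), with the convention 0 log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A probability measure P outside the family (chosen for illustration).
P = np.array([0.5, 0.0, 0.0, 0.5])

# Approximate the infimum over the natural parameter theta on a bounded interval.
res = minimize_scalar(lambda t: kl(P, family_member(t)),
                      bounds=(-20.0, 20.0), method="bounded")
print("approximate divergence from the family:", res.fun)
```

For this particular $P$ the minimizing member is the uniform distribution ($\theta = 0$), so the printed value is approximately $\log 2$. The paper's concern is the harder outer problem of maximizing this divergence over $P$.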

Received:
May 17, 2003
Published:
May 17, 2003
MSC Codes:
94A17, 62B10, 60A10
Keywords:
Kullback-Leibler divergence, information projection, exponential family, infomax principle

Related publications

inBook
2003 Repository Open Access
František Matúš and Nihat Ay

On maximization of the information divergence from an exponential family

In: Proceedings of the 6th Workshop on Uncertainty Processing: Hejnice, September 24-27, 2003
Praha: Oeconomica, 2003, pp. 199-204