On maximization of the information divergence from an exponential family
František Matúš and Nihat Ay
Submission date: 17 May 2003
Published in: Proceedings of the 6th Workshop on Uncertainty Processing, Hejnice, September 24-27, 2003. Praha: Oeconomica, 2003, pp. 199-204
MSC-Numbers: 94A17, 62B10, 60A10
Keywords and phrases: Kullback-Leibler divergence, information projection, exponential family, infomax principle
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q, taken over all Q in E. For convex exponential families, the local maximizers of this function of P are found. A general exponential family E of dimension d is enlarged to an exponential family of dimension at most 3d+2 such that the local maximizers of the divergence from the enlarged family have zero divergence from it.
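The quantity studied here, D(P||E) = inf_{Q in E} D(P||Q), can be illustrated numerically. The sketch below (an illustrative example, not taken from the paper) uses the independence model on a 2x2 state space as the exponential family E: there the minimizer Q is the product of the marginals of P, so D(P||E) equals the mutual information of P, which connects to the infomax principle named in the keywords. The joint distribution P is a hypothetical example value.

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) in nats, with 0*log(0/q) = 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical joint distribution P on a 2x2 set
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# For the independence model E, the divergence-minimizing Q in E is
# the product of the marginals of P, so D(P || E) is the mutual
# information of P.
Q = np.outer(P.sum(axis=1), P.sum(axis=0))
div = kl(P, Q)

# Sanity check: scan a grid of product distributions q1 (x) q2 and
# confirm none gives a smaller divergence from P than Q does.
grid = np.linspace(0.01, 0.99, 99)
best = min(kl(P, np.outer([a, 1 - a], [b, 1 - b]))
           for a in grid for b in grid)
print(div, best)
```

Maximizing `div` over P would recover a local maximizer of the divergence from this particular family, the object the paper characterizes in general.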