Nihat Ay proposed the following problem [1], motivated by statistical learning theory: Let E be an exponential family. Find the maximizers of the Kullback-Leibler distance D(·||E) from E. A maximizing probability measure P has many interesting properties. For example, the restriction of its information projection P_E to the support of P, after renormalization, will be equal to P, i.e. P(x) = P_E(x)/P_E(supp P) if x ∈ supp P (for the proof in the most general case see [2]). This simple property can be used to transform the problem into another form. The first observation is that probability measures having this "projection property" always come in pairs (P⁺, P⁻), such that P⁺ and P⁻ have the same sufficient statistics and disjoint supports. Therefore we can solve the original problem by investigating the kernel of the sufficient statistics map A. If we find all local maximizers of the induced divergence function on ker A, subject to a suitable normalization, then we know all maximizers of the original problem. The talk will present the transformed problem and its relation to the original problem. In the end I will give some consequences for the solutions of the original problem.
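A minimal numerical sketch of the projection property, assuming the independence model of two binary variables as the exponential family (the toy example, the helper project, and the variable names are illustrative choices here, not material from [1] or [2]): a kernel vector u of the sufficient statistics matrix A is split into a pair P⁺, P⁻ with equal sufficient statistics and disjoint supports, and their common projection, restricted to supp(P⁺) and renormalized, recovers P⁺.

```python
import numpy as np

# States ordered as (x, y) in {00, 01, 10, 11}.
states = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Sufficient statistics A of the independence model: constant, x-marginal, y-marginal.
A = np.array([[1, 1, 1, 1],
              [0, 0, 1, 1],    # indicator of x = 1
              [0, 1, 0, 1]])   # indicator of y = 1

# A vector u in ker A, split into the pair P+ and P- (same statistics, disjoint supports).
u = np.array([1.0, -1.0, -1.0, 1.0])
assert np.allclose(A @ u, 0)
P_plus = np.clip(u, 0, None);  P_plus /= P_plus.sum()     # uniform on {00, 11}
P_minus = np.clip(-u, 0, None); P_minus /= P_minus.sum()  # uniform on {01, 10}

# For the independence model, the information projection of a distribution
# is the product of its marginals; P+ and P- share the same projection.
def project(P):
    px = np.array([P[0] + P[1], P[2] + P[3]])  # marginal of x
    py = np.array([P[0] + P[2], P[1] + P[3]])  # marginal of y
    return np.array([px[x] * py[y] for (x, y) in states])

P_E = project(P_plus)
assert np.allclose(P_E, project(P_minus))

# Projection property: restricting P_E to supp(P+) and renormalizing recovers P+.
restricted = np.where(P_plus > 0, P_E, 0.0)
assert np.allclose(restricted / restricted.sum(), P_plus)

# The divergence D(P+ || P_E) equals log 2, the maximum for this toy model.
D = sum(p * np.log(p / q) for p, q in zip(P_plus, P_E) if p > 0)
print(D, np.log(2))
```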
[1] N. Ay: An Information-Geometric Approach to a Theory of Pragmatic Structuring. The Annals of Probability 30 (2002) 416-436.
[2] F. Matúš: Optimality conditions for maximizers of the information divergence from an exponential family. Kybernetika 43 (2007) 731-746.