$U$-entropy and maximum entropy model
- Shinto Eguchi (Institute of Statistical Mathematics, Japan)
Abstract
Boltzmann-Shannon entropy leads to the exponential model as the maximum entropy model under the constraint, on the space of pdfs, that the expectation of a given statistic $t(x)$ equals a common vector. The maximum likelihood estimator for the expectation parameter of $t(x)$ under the exponential model is characterized by specific properties such as attainment of the Cramér-Rao bound. Any convex generator function $U$ defines a $U$-entropy and a $U$-divergence. In this framework, the $U$-entropy leads to the $U$-model as the maximum entropy model, under which the minimum $U$-divergence estimator for the expectation parameter is characterized by a structure of orthogonal foliation. If $U(s) = \exp(s)$, this reduces to the case of Boltzmann-Shannon entropy. Surprisingly, we observe that the minimum $U$-divergence estimator under the $U$-model has a unique form, namely the sample mean of $t(x)$. Alternatively, if the minimum $U$-divergence estimator is employed under a model generated by a different $U$, then the estimator takes another form, a weighted mean of $t(x)$ over the sample. This talk discusses information-geometric understandings of these aspects in terms of the Pythagorean identity, a minimax game, and robustness.
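For reference, a minimal sketch of the standard definitions from the $U$-divergence literature (sign and normalization conventions may differ from those used in the talk): writing $u = U'$ for the derivative of the convex generator and $\xi = u^{-1}$ for its inverse, the $U$-cross entropy, $U$-entropy, and $U$-divergence between pdfs $f$ and $g$ are
$$
C_U(f, g) = \int \bigl\{ U(\xi(g(x))) - f(x)\,\xi(g(x)) \bigr\}\, dx,
$$
$$
H_U(f) = C_U(f, f), \qquad D_U(f, g) = C_U(f, g) - C_U(f, f) \ \ge\ 0,
$$
where nonnegativity follows from the convexity of $U$, since the integrand of $D_U$ is $U(\xi(g)) - U(\xi(f)) - u(\xi(f))\bigl(\xi(g) - \xi(f)\bigr)$ with $u(\xi(f)) = f$. For $U(s) = \exp(s)$ one has $\xi(t) = \log t$, so $C_U(f, g) = \int \{ g - f \log g \}\, dx$ and, on normalized densities, $D_U(f, g) = \int f \log (f/g)\, dx$, the Kullback-Leibler divergence; the corresponding maximum entropy model is the exponential family generated by $t(x)$.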