Relation between Jaynes estimation and Fisher information
- Raymond Streater (King's College London, London, United Kingdom)
Jaynes suggested that, given random variables $(X_1, \dots , X_n)$ with unknown joint distribution, together with measurements of them, the best estimate of their joint distribution is the state of maximum entropy among all states whose predicted mean of each $X_i$ equals its measured mean. We show that this is indeed the best estimate, in the sense that the scores $X_i - \langle X_i \rangle$ have the least joint variance under this condition. The result is extended to quantum mechanics: the estimation of $n$ self-adjoint quadratic forms that are Kato-small relative to a given positive self-adjoint operator.
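For the reader's orientation (this is standard background, not part of the result announced above), the maximum-entropy state under mean constraints has a well-known explicit form: maximizing the entropy subject to normalization and to the constraints $\langle X_i \rangle = m_i$ yields an exponential-family (Gibbs) distribution, whose log-partition function generates the means and the covariance of the scores.

```latex
% Maximize S(p) = -\int p \log p \, dx subject to
% \int p \, dx = 1 and \int x_i \, p \, dx = m_i, i = 1,\dots,n.
% Introducing Lagrange multipliers \lambda_1,\dots,\lambda_n gives
\[
  p_\lambda(x) = \frac{1}{Z(\lambda)} \exp\Big(-\sum_{i=1}^{n} \lambda_i x_i\Big),
  \qquad
  Z(\lambda) = \int \exp\Big(-\sum_{i=1}^{n} \lambda_i x_i\Big)\, dx ,
\]
% with \lambda chosen so that the constraints hold:
\[
  -\frac{\partial \log Z}{\partial \lambda_i} = m_i ,
  \qquad
  \frac{\partial^2 \log Z}{\partial \lambda_i \, \partial \lambda_j}
  = \operatorname{Cov}(X_i, X_j)
  = \big\langle (X_i - m_i)(X_j - m_j) \big\rangle .
\]
```

The second identity shows why the scores $X_i - \langle X_i \rangle$ are the natural objects here: their covariance matrix is the Hessian of $\log Z$, which is also the Fisher information matrix of the exponential family in the parameters $\lambda$.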