Statistical Learning Theory
- Thomas Kahle (MPI MiS, Leipzig)
Abstract
A fundamental question of statistics, first formulated and answered by Fisher, is: what must one know a priori about an unknown functional dependence in order to estimate it on the basis of observations?
Until the 1980s the answer to this was: "Almost everything", namely one must know the functional dependence up to the values of a finite number of parameters. The research of V. Vapnik et al. replaced this answer with a new paradigm that overcame these restrictions.
In this talk I will introduce the general learning problem as the minimization of a functional that depends on an unknown probability measure and describe the results of Vapnik's research. In particular, I will discuss the principle of empirical risk minimization, probabilistic bounds on the risk, and how the generalization ability of a learning machine can be controlled to optimize learning from small sample sizes.
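For concreteness, the general learning problem and the empirical risk minimization principle mentioned above are usually written as follows (a standard formulation in Vapnik's framework, with loss function L, function class f(x, alpha), and sample size l as notational assumptions):

```latex
% Risk functional over the unknown probability measure P(x,y):
R(\alpha) = \int L\bigl(y, f(x,\alpha)\bigr)\, dP(x,y),
% Empirical risk over a sample (x_1,y_1),\dots,(x_\ell,y_\ell):
\qquad
R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell}\sum_{i=1}^{\ell} L\bigl(y_i, f(x_i,\alpha)\bigr).
```

Since P(x, y) is unknown, R cannot be minimized directly; empirical risk minimization instead selects the parameter alpha minimizing R_emp, and the probabilistic bounds discussed in the talk relate R_emp back to R.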