Statistical Learning Theory

  • Thomas Kahle (MPI MiS, Leipzig)
A3 01 (Sophus-Lie room)


A fundamental question of statistics, first formulated and answered by Fisher, is: what must one know a priori about an unknown functional dependence in order to estimate it on the basis of observations?

Until the 1980s the answer was: "Almost everything", namely one must know the functional dependence up to the values of a finite number of parameters. The research of V. Vapnik replaced this answer with a new paradigm that overcame these restrictions.

In this talk I will introduce the general learning problem as the minimization of a functional that depends on an unknown probability measure, and describe the results of Vapnik's research. In particular I will discuss the principle of empirical risk minimization, probabilistic bounds on the risk, and how the generalization ability of a learning machine can be controlled to optimize learning from small sample sizes.
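In Vapnik's standard notation, the setup can be sketched as follows: the risk functional is an expectation of a loss L under the unknown measure P, and empirical risk minimization replaces it by an average over the l observed samples.

```latex
% Risk functional over the unknown probability measure P(x,y)
R(\alpha) = \int L\bigl(y, f(x,\alpha)\bigr)\, dP(x,y)

% Empirical risk over the observed sample (x_1,y_1),\dots,(x_l,y_l)
R_{\mathrm{emp}}(\alpha) = \frac{1}{l} \sum_{i=1}^{l} L\bigl(y_i, f(x_i,\alpha)\bigr)
```

The principle of empirical risk minimization selects the parameter $\alpha$ minimizing $R_{\mathrm{emp}}(\alpha)$; Vapnik's probabilistic bounds relate $R(\alpha)$ to $R_{\mathrm{emp}}(\alpha)$ via the capacity (VC dimension) of the function class $\{f(\cdot,\alpha)\}$.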

31.10.05

Seminar Statistical Mechanics

Universität Leipzig, Room 01/22

Katharina Matschke

MPI for Mathematics in the Sciences, contact via email