Mathematics of Machine Learning

MPI for Mathematics in the Sciences / University of Leipzig (Leipzig), live broadcast from University of Bielefeld, room U2-113

Abstract

The lecture “Mathematics of Machine Learning” serves as an introduction to the analysis of common numerical problems arising in modern machine learning applications. We will focus in particular on supervised learning and the dynamics of stochastic gradient descent. This involves the analysis of discrete-time stochastic processes whose behavior is closely linked to deterministic as well as stochastic differential equations. We derive convergence rates for the classical Robbins-Monro algorithm and its Ruppert-Polyak smoothing, and we analyze the effect of adding inertia (momentum) to the dynamical system. Other possible topics include (stable) central limit theorems, Multilevel Monte Carlo, and reinforcement learning. While the general techniques for the asymptotic analysis of stochastic processes will be introduced, a solid basic knowledge of probability theory (including martingale theory) is required.
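
For concreteness, the following is a minimal numerical sketch (Python with NumPy) of the objects named above: the Robbins-Monro/SGD recursion with decaying step sizes, its Ruppert-Polyak average, and a heavy-ball (momentum) variant. The quadratic test objective, noise level, step-size schedule, and all function names are hypothetical illustrations, not part of the lecture material.

import numpy as np

rng = np.random.default_rng(0)

def noisy_gradient(theta):
    # Stochastic gradient of the hypothetical test objective f(theta) = 0.5 * ||theta||^2,
    # observed with additive Gaussian noise.
    return theta + rng.normal(scale=0.5, size=theta.shape)

def robbins_monro(theta0, n_steps=10_000, c=1.0, alpha=0.75):
    # Robbins-Monro / SGD recursion theta_{n+1} = theta_n - gamma_{n+1} * G(theta_n)
    # with step sizes gamma_n = c / n**alpha, plus the Ruppert-Polyak average of the iterates.
    theta = theta0.copy()
    avg = np.zeros_like(theta)
    for n in range(1, n_steps + 1):
        gamma = c / n**alpha
        theta = theta - gamma * noisy_gradient(theta)
        avg += (theta - avg) / n  # running mean of theta_1, ..., theta_n
    return theta, avg

def heavy_ball(theta0, n_steps=10_000, gamma=0.01, beta=0.9):
    # SGD with inertia (heavy-ball momentum): the update keeps a velocity term.
    theta = theta0.copy()
    velocity = np.zeros_like(theta)
    for _ in range(n_steps):
        velocity = beta * velocity - gamma * noisy_gradient(theta)
        theta = theta + velocity
    return theta

theta0 = np.full(3, 5.0)
last, averaged = robbins_monro(theta0)
print("last iterate:      ", last)
print("Ruppert-Polyak avg:", averaged)  # typically closer to the minimizer 0 than the last iterate
print("heavy-ball iterate:", heavy_ball(theta0))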

Date and time info
Wednesdays, 16:00-18:00

Keywords
Stochastic Gradient Descent, Stochastic Approximation, Ruppert-Polyak averaging, Supervised learning

Prerequisites
Basic knowledge of probability theory (including martingale theory)

Remarks and notes
This is a hybrid lecture held at the University of Bielefeld, with the option to participate online.

Lecture
01.10.2023 to 31.01.2024

Regular lectures, Winter semester 2023-2024

MPI for Mathematics in the Sciences / University of Leipzig (see the lecture detail pages)

Katharina Matschke

MPI for Mathematics in the Sciences (contact via email)