

Summary of the talk on 03.09.2020 (17:00)
Math Machine Learning seminar MPI MIS + UCLA
Niladri S. Chatterji (UC Berkeley)
Upper and lower bounds for gradient based sampling methods
See also the video of this talk.
This talk presents two results on gradient-based sampling methods.
First, I will present upper bounds for the problem of sampling from a distribution \(p^*(x) \propto \exp(-U(x))\), where the potential function \(U\) is smooth everywhere and strongly convex outside a ball of radius \(R\), but potentially nonconvex inside this ball. We study both overdamped and underdamped Langevin MCMC and establish upper bounds on the number of steps required to obtain a sample from a distribution that is within \(\epsilon\) of the target distribution \(p^*\) in 1-Wasserstein distance.
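
For reference, here is a minimal NumPy sketch of the two samplers named above, assuming access to a gradient oracle for \(U\). The function names, step sizes, the friction parameter `gamma`, the discretization of the underdamped dynamics, and the example potential are all illustrative choices, not the talk's actual algorithms or constants.

```python
import numpy as np

def overdamped_langevin(grad_U, x0, step, n_steps, rng=None):
    """Overdamped Langevin MCMC (unadjusted Langevin algorithm).

    Update: x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2 * step) * xi_k,
    with xi_k ~ N(0, I); for small step sizes the iterates approximately
    sample from p*(x) proportional to exp(-U(x)).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

def underdamped_langevin(grad_U, x0, step, n_steps, gamma=1.0, rng=None):
    """Underdamped Langevin MCMC, discretized with a simple semi-implicit
    Euler scheme (a sketch; the talk's analysis uses its own discretization,
    which may differ). Position x and velocity v evolve jointly; gamma is
    the friction coefficient.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        x = x + step * v
        v = (v - step * (gamma * v + grad_U(x))
             + np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.shape))
    return x

# Hypothetical potential U(x) = ||x||^2/2 + 4*exp(-||x||^2/2): the Gaussian
# bump makes U nonconvex near the origin but strongly convex outside a ball.
grad_U = lambda x: x * (1.0 - 4.0 * np.exp(-0.5 * (x @ x)))
sample = overdamped_langevin(grad_U, np.zeros(2), step=1e-2, n_steps=10_000)
```
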
In the second part of the talk, I will discuss our recent work, which establishes information-theoretic lower bounds on the iteration complexity of stochastic gradient-based algorithms for sampling from strongly log-concave densities.
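
As a sketch of one canonical algorithm in this class, here is stochastic gradient Langevin dynamics: the overdamped update with an unbiased stochastic gradient estimate. The callback `stoch_grad_U` (e.g. a mini-batch gradient) is an assumed interface for illustration; the talk's lower bounds concern this class of methods rather than this particular implementation.

```python
import numpy as np

def sgld(stoch_grad_U, x0, step, n_steps, rng=None):
    """Stochastic gradient Langevin dynamics (SGLD).

    Same update as overdamped Langevin, but grad_U(x) is replaced by an
    unbiased stochastic estimate, e.g. computed from a data mini-batch.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = stoch_grad_U(x, rng)  # unbiased: E[g] = grad U(x)
        x = x - step * g + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x
```
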
This is joint work with Yasin Abbasi-Yadkori, Peter Bartlett, Xiang Cheng, Michael Jordan and Philip Long.