Talk

Upper and lower bounds for gradient based sampling methods

  • Niladri S. Chatterji (UC Berkeley)

Abstract

This talk will present two results on gradient-based sampling methods.

First, I will present upper bounds for the problem of sampling from a distribution p(x) ∝ exp(−U(x)), where the potential function U is smooth everywhere and strongly convex outside a ball of radius R, but potentially nonconvex inside this ball. We study both overdamped and underdamped Langevin MCMC and establish upper bounds on the number of steps required to obtain a sample from a distribution that is within ϵ of the target distribution p in 1-Wasserstein distance.
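As a concrete illustration, below is a minimal Python sketch of the overdamped Langevin iteration analyzed in the first part. The function names, step size, and the double-well example (nonconvex near the origin, strongly convex outside a ball) are illustrative choices, not the parameters or implementation from the paper.

```python
import numpy as np

def overdamped_langevin(grad_U, x0, step_size=1e-3, n_steps=10_000, rng=None):
    """Unadjusted (overdamped) Langevin MCMC targeting p(x) ∝ exp(−U(x)).

    Each iteration takes a gradient step on U plus Gaussian noise:
        x_{k+1} = x_k − step_size · grad_U(x_k) + sqrt(2 · step_size) · ξ_k
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_U(x) + np.sqrt(2.0 * step_size) * noise
        samples.append(x.copy())
    return np.array(samples)

# Illustrative potential U(x) = (x² − 1)²: nonconvex inside a ball around
# the origin (U''(0) = −4) but strongly convex outside it (U''(x) ≥ 8 for |x| ≥ 1),
# matching the structural assumption in the abstract.
grad_U = lambda x: 4.0 * x * (x**2 - 1.0)
samples = overdamped_langevin(grad_U, x0=np.array([0.0]),
                              step_size=1e-2, n_steps=50_000)
```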

In the second part of the talk, I will discuss our recent work that establishes information-theoretic lower bounds on the iteration complexity of stochastic gradient-based algorithms for sampling from strongly log-concave densities.

This is joint work with Yasin Abbasi-Yadkori, Peter Bartlett, Xiang Cheng, Michael Jordan and Philip Long.

Links


Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences Live Stream
