Upper and lower bounds for gradient-based sampling methods

  • Niladri S. Chatterji (UC Berkeley)

Abstract

This talk presents two results on gradient-based sampling methods.

First, I will present upper bounds for the problem of sampling from a distribution $p^*(x) \propto \exp(-U(x))$, where the potential function $U$ is smooth everywhere and strongly convex outside a ball of radius $R$, but potentially nonconvex inside this ball. We study both overdamped and underdamped Langevin MCMC and establish upper bounds on the number of steps required to obtain a sample from a distribution that is within $\epsilon$ of the target distribution $p^*$ in 1-Wasserstein distance.
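To make the setting concrete, here is a minimal sketch of the two samplers, in plain NumPy. The potentials, step sizes, and the naive Euler discretization of the underdamped dynamics are illustrative assumptions, not the discretization analyzed in the talk.

```python
import numpy as np

def overdamped_langevin(grad_U, x0, step_size, n_steps, rng=None):
    """Unadjusted overdamped Langevin MCMC:
    x_{k+1} = x_k - h * grad_U(x_k) + sqrt(2h) * N(0, I)."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = (x - step_size * grad_U(x)
             + np.sqrt(2 * step_size) * rng.standard_normal(x.shape))
    return x

def underdamped_langevin(grad_U, x0, step_size, n_steps, gamma=2.0, rng=None):
    """Schematic Euler discretization of underdamped (kinetic) Langevin
    dynamics with friction gamma; the analysis in the talk uses a more
    careful integrator, so this is only illustrative."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        v = (v - step_size * (gamma * v + grad_U(x))
             + np.sqrt(2 * gamma * step_size) * rng.standard_normal(x.shape))
        x = x + step_size * v
    return x

# Hypothetical example in the assumed class: the double-well potential
# U(x) = (x^2 - 1)^2 is smooth, nonconvex near the origin, and strongly
# convex outside a small ball around it.
grad_U = lambda x: 4.0 * x * (x**2 - 1.0)
draw = overdamped_langevin(grad_U, x0=np.zeros(1), step_size=1e-3, n_steps=50_000)
```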

In the second part of the talk, I will discuss our recent work establishing information-theoretic lower bounds on the iteration complexity of stochastic gradient-based algorithms for sampling from strongly log-concave densities.
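For context, here is a minimal sketch of a sampler in the class these lower bounds concern: stochastic gradient Langevin dynamics, which sees the potential only through unbiased noisy gradient estimates. The oracle, target, and hyperparameters below are illustrative assumptions, not the construction from the paper.

```python
import numpy as np

def sgld(stoch_grad_U, x0, step_size, n_steps, rng=None):
    """Stochastic gradient Langevin dynamics: an overdamped Langevin step
    driven by an unbiased stochastic gradient estimate, the kind of oracle
    the iteration-complexity lower bounds are stated against."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = stoch_grad_U(x, rng)  # unbiased estimate of grad U(x)
        x = (x - step_size * g
             + np.sqrt(2 * step_size) * rng.standard_normal(x.shape))
    return x

# Hypothetical strongly log-concave target N(0, I), i.e. U(x) = ||x||^2 / 2,
# with the gradient observed through additive Gaussian noise.
sigma = 0.5
noisy_grad = lambda x, rng: x + sigma * rng.standard_normal(x.shape)
draw = sgld(noisy_grad, x0=np.zeros(2), step_size=1e-2, n_steps=10_000)
```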

This is joint work with Yasin Abbasi-Yadkori, Peter Bartlett, Xiang Cheng, Michael Jordan and Philip Long.


Math Machine Learning seminar MPI MIS + UCLA

Location: MPI for Mathematics in the Sciences (live stream)

Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via mail)
