Understanding the Algorithmic Regularization due to Dropout

  • Poorya Mianjy (Johns Hopkins University)

Abstract

Algorithmic regularization provides deep learning models with capacity control that helps them generalize. In this talk, we focus on understanding such capacity control due to dropout training in various machine learning models, including deep linear networks, matrix sensing, and two-layer ReLU networks. In particular, by characterizing the regularizer induced by dropout training, we give concrete generalization error bounds for dropout training in these models.
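As a concrete illustration (a sketch, not part of the abstract), the regularizer induced by dropout can be written in closed form for a two-layer linear network with squared loss: with inverted dropout (keep probability p, rescaling by 1/p) applied to the hidden layer of x ↦ UVx, the expected dropout objective equals the plain squared loss plus (1−p)/p · Σᵢ ‖uᵢ‖² (vᵢᵀx)², where uᵢ and vᵢᵀ are the columns of U and rows of V. The numpy sketch below checks this identity by Monte Carlo; all dimensions and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h, m = 5, 4, 3   # input dim, hidden width, output dim (illustrative)
p = 0.8             # dropout keep probability

U = rng.normal(size=(m, h))   # second-layer weights
V = rng.normal(size=(h, d))   # first-layer weights
x = rng.normal(size=d)
y = rng.normal(size=m)

# Monte Carlo estimate of the expected dropout loss
#   E_b || y - U diag(b/p) V x ||^2,   b_i ~ Bernoulli(p)   (inverted dropout)
n = 200_000
B = rng.binomial(1, p, size=(n, h)) / p    # n sampled dropout masks
preds = (B * (V @ x)) @ U.T                # n dropout predictions, shape (n, m)
mc = np.mean(np.sum((y - preds) ** 2, axis=1))

# Closed form: plain squared loss plus the induced regularizer
#   (1-p)/p * sum_i ||u_i||^2 (v_i^T x)^2
plain = np.sum((y - U @ V @ x) ** 2)
reg = (1 - p) / p * np.sum(np.sum(U ** 2, axis=0) * (V @ x) ** 2)

print(f"MC estimate: {mc:.4f}, plain loss + regularizer: {plain + reg:.4f}")
```

The 1/p rescaling makes the dropout network unbiased for the plain network's output, so the gap between the two objectives is exactly the variance term that acts as the regularizer.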


Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences Live Stream

Contact: Katharina Matschke, MPI for Mathematics in the Sciences
