Abstract for the talk on 14.07.2022 (17:00 h)

Math Machine Learning seminar MPI MIS + UCLA

Rong Ge (Duke University)
Towards Understanding Training Dynamics for Mildly Overparametrized Models

While overparametrization is widely believed to be crucial for the success of optimization in neural networks, most existing theories of overparametrization do not fully explain why: they either work in the Neural Tangent Kernel regime, where neurons barely move, or require an enormous number of neurons. In this talk I will describe our recent work towards understanding training dynamics beyond kernel regimes with only polynomially many neurons (mildly overparametrized). In particular, we first give a local convergence result for mildly overparametrized two-layer networks. We then analyze the global training dynamics for a related overparametrized tensor model. Both works rely on a key intuition: neurons in overparametrized models work in groups, and it is important to understand the behavior of an average neuron in each group. Based on two works: arxiv.org/abs/2102.02410 and arxiv.org/abs/2106.06573
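To make the setting concrete, the following is a minimal, illustrative sketch (not the papers' exact setup or analysis): a "mildly overparametrized" two-layer ReLU student network, whose width is only a small multiple of the teacher's width rather than the enormous width NTK-style analyses require, trained by plain gradient descent on data labeled by the teacher. All widths, learning rates, and iteration counts below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions (illustrative): input dim, teacher width, student width, samples.
# The student is only 4x wider than the teacher -- "mildly" overparametrized.
d, k_teacher, k_student, n = 5, 3, 12, 500

# Teacher network: f*(x) = sum_j relu(w*_j . x)
W_star = rng.standard_normal((k_teacher, d))
X = rng.standard_normal((n, d))
y = np.maximum(X @ W_star.T, 0).sum(axis=1)

# Student network: f(x) = sum_i a_i relu(w_i . x), small random init,
# fixed positive second layer (a simplification for this sketch).
W = 0.1 * rng.standard_normal((k_student, d))
a = np.full(k_student, k_teacher / k_student)

def loss(W, a):
    pred = np.maximum(X @ W.T, 0) @ a
    return 0.5 * np.mean((pred - y) ** 2)

lr = 0.05
init_loss = loss(W, a)
for _ in range(2000):
    H = np.maximum(X @ W.T, 0)      # hidden activations, shape (n, k_student)
    err = (H @ a - y) / n           # dL/dpred, averaged over samples
    # Gradient for first-layer weights (ReLU derivative = indicator H > 0)
    grad_W = ((err[:, None] * (H > 0)) * a).T @ X
    W -= lr * grad_W

print(f"loss: {init_loss:.4f} -> {loss(W, a):.4f}")
```

In the "group" picture from the abstract, one would cluster the trained student rows of `W` by their nearest teacher direction in `W_star` and track the average neuron of each cluster; here the sketch only verifies that gradient descent makes progress at this modest width.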
