Abstract for the talk on 12.05.2022 (17:00 h)

Math Machine Learning seminar MPI MIS + UCLA

Zixiang Chen (UCLA)
Benign Overfitting in Two-layer Convolutional Neural Networks

Modern neural networks often have great expressive power and can be trained to overfit the training data while still achieving good test performance. This phenomenon is referred to as “benign overfitting”. Recently, a line of theoretical work has emerged studying benign overfitting. However, these results are limited to linear models or kernel/random-feature models, and there is still a lack of theoretical understanding of when and how benign overfitting occurs in neural networks.

In this talk, I will address this question. We precisely characterize the conditions under which benign overfitting can occur when training two-layer convolutional neural networks. We show that when the signal-to-noise ratio satisfies a certain condition, a two-layer CNN trained by gradient descent achieves arbitrarily small training and test loss. When this condition does not hold, overfitting becomes harmful and the trained CNN can only achieve a constant level of test loss. Together, these results demonstrate a sharp phase transition between benign and harmful overfitting, driven by the signal-to-noise ratio.

This talk is based on joint work with Yuan Cao, Mikhail Belkin, and Quanquan Gu.
