Math Machine Learning seminar MPI MIS + UCLA
Abstract for the talk on 27.04.2023 (17:00 h)
Stéphane d'Ascoli (ENS and FAIR Paris)
Double descent: insights from the random feature model
See the video of this talk.
See the slides of this talk.
In this talk I will present various insights on the double descent curve obtained by considering a solvable model for deep learning: the random feature model. First, I will present a fine-grained bias-variance decomposition and show how the double descent curve can be reconciled with the traditional bias-variance tradeoff. Then, I will show that two different kinds of overfitting, which are often conflated, can give rise to a “double descent” curve, and can actually occur simultaneously, leading to a triple descent curve. Finally, I will extend some of these findings to classification tasks on structured data, showing the impact of the loss function and the role of low-dimensional structures.
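As background for the model named in the abstract, the following is a minimal sketch of a random feature regressor: inputs are passed through a fixed random first layer with a ReLU nonlinearity, and only the second-layer weights are fit by minimum-norm least squares. All dimensions, the noise level, and the teacher setup here are illustrative assumptions, not the talk's exact configuration; sweeping the number of features past the interpolation threshold (features ≈ training samples) is the standard way the double descent curve is produced in this model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative teacher: noisy linear target in d dimensions
# (assumed setup, not the talk's exact configuration).
d, n_train, n_test, noise = 20, 80, 400, 0.5
w_star = rng.standard_normal(d) / np.sqrt(d)
X_tr = rng.standard_normal((n_train, d))
X_te = rng.standard_normal((n_test, d))
y_tr = X_tr @ w_star + noise * rng.standard_normal(n_train)
y_te = X_te @ w_star

def rf_test_error(p, n_avg=5):
    """Test MSE of a random-feature (ReLU) regressor with p features,
    fit by minimum-norm least squares, averaged over feature draws."""
    errs = []
    for seed in range(n_avg):
        r = np.random.default_rng(seed)
        W = r.standard_normal((d, p)) / np.sqrt(d)  # fixed random first layer
        Z_tr = np.maximum(X_tr @ W, 0.0)            # ReLU random features
        Z_te = np.maximum(X_te @ W, 0.0)
        a = np.linalg.pinv(Z_tr) @ y_tr             # min-norm least squares
        errs.append(np.mean((Z_te @ a - y_te) ** 2))
    return float(np.mean(errs))

# Test error as a function of the number of random features p;
# the double descent peak typically sits near p ≈ n_train.
ps = [10, 40, 80, 160, 640]
errors = {p: rf_test_error(p) for p in ps}
```

Plotting `errors` against `ps` typically shows the test error rising toward the interpolation threshold (p ≈ n_train = 80) and then descending again in the overparameterized regime, i.e. the double descent curve the talk analyzes.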