Talk

Generalization Theory of Linearized Neural Networks

  • David Holzmüller (INRIA Paris)
Abstract

Despite the great practical success of deep learning, our theoretical understanding of the generalization of neural networks is still limited. Theoretical analysis can be facilitated by studying "linearized" neural networks, for example through neural tangent kernel theory. While this allows one to leverage some classical theory for regularized kernel methods, only more recent theoretical results have made it possible to study phenomena such as benign overfitting and double descent. In this talk, I will give an overview of my research on the generalization of neural network regression, as well as some related works.
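
As background (a standard sketch in neural tangent kernel notation, not taken from the abstract itself): "linearizing" a network f(x; θ) means taking its first-order Taylor expansion in the parameters around the initialization θ_0,

    f_lin(x; θ) = f(x; θ_0) + ∇_θ f(x; θ_0)^T (θ − θ_0).

This model is linear in θ, and its feature map ∇_θ f(·; θ_0) induces the empirical neural tangent kernel k(x, x') = ⟨∇_θ f(x; θ_0), ∇_θ f(x'; θ_0)⟩, so training the linearized model amounts to (regularized) kernel regression with k.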

Seminar

Math Machine Learning seminar MPI MIS + UCLA

Date

19.12.2024

Venue

MPI for Mathematics in the Sciences, Live Stream

Contact

Katharina Matschke (MPI for Mathematics in the Sciences), via mail
