Talk
Generalization Theory of Linearized Neural Networks
- David Holzmüller (INRIA Paris)
Abstract
Despite the great practical success of deep learning, our theoretical understanding of the generalization of neural networks is still limited. Theoretical analysis can be facilitated by studying "linearized" neural networks, for example through neural tangent kernel theory. While this makes it possible to leverage some classical theory for regularized kernel methods, only more recent theoretical results have made it possible to study phenomena such as benign overfitting and double descent. In this talk, I will give an overview of my research on the generalization of neural network regression as well as some related work.
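
As background (this is the standard notion from neural tangent kernel theory, not a formula specific to this talk), a "linearized" network usually refers to the first-order Taylor expansion of the network output f(x; θ) in the parameters θ around the initialization θ_0:

    f_lin(x; θ) = f(x; θ_0) + ∇_θ f(x; θ_0)^T (θ - θ_0).

This model is linear in θ, so training it with squared loss amounts to (possibly regularized) kernel regression with the neural tangent kernel k(x, x') = ∇_θ f(x; θ_0)^T ∇_θ f(x'; θ_0), which is what makes classical kernel theory applicable.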