
Neural Tangent Kernels: Data augmentation and Feynman diagrams

  • Jan Gerken (Chalmers University of Technology)
Live Stream

Abstract

In this talk, I will discuss how neural tangent kernels (NTKs) can be used to understand the symmetry properties of deep ensembles trained with data augmentation. In the infinite-width limit, we prove that such ensembles are equivariant at any training step, even off the data manifold, and that the predictor becomes equivalent to a group convolutional neural network. This equivariance is emergent: individual ensemble members are not equivariant, but their collective prediction is. I will prove this theoretically using NTK theory and verify the insights with numerical experiments.
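The emergent-equivariance phenomenon can be illustrated with a toy sketch (not the speaker's setup): an ensemble of random-feature ridge-regression models is trained on data augmented with the full orbit of a cyclic shift group. Each member, with its own random features, is visibly non-invariant, but the ensemble mean is far closer to invariant. All names and sizes below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def shift(x, k):
    # action of the cyclic group C_4 on R^4: cyclic shift by k positions
    return np.roll(x, k, axis=-1)

# invariant target: depends only on the C_4 orbit of x
target = lambda X: np.sin(X).sum(axis=-1)

# training data, augmented with the full group orbit (data augmentation)
X = rng.normal(size=(50, 4))
X_aug = np.concatenate([shift(X, k) for k in range(4)])
y_aug = np.tile(target(X), 4)

def train_member(seed):
    # one ensemble member: random-feature ridge regression;
    # the random first layer W is NOT equivariant
    r = np.random.default_rng(seed)
    W = r.normal(size=(4, 200))
    phi = lambda Z: np.tanh(Z @ W)
    A = phi(X_aug)
    w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(200), A.T @ y_aug)
    return lambda Z: phi(Z) @ w

members = [train_member(s) for s in range(200)]
ensemble = lambda Z: np.mean([f(Z) for f in members], axis=0)

x_test = rng.normal(size=(1, 4))

# invariance defect: |f(x) - f(gx)| for a nontrivial group element g
defect = lambda f: abs(f(x_test)[0] - f(shift(x_test, 1))[0])

member_defects = np.array([defect(f) for f in members])
print("mean member defect:  ", member_defects.mean())
print("ensemble defect:     ", defect(ensemble))
```

Because the distribution of the random features is permutation-invariant and the training set is closed under the group action, the expected predictor is invariant; averaging over members approximates that expectation, which is the same mechanism (in miniature) as averaging over initializations in a deep ensemble.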

I will also discuss recent work on going beyond the infinite-width limit using Feynman diagrams. While infinite-width NTKs are analytically tractable, they miss important phenomena such as NTK evolution and feature learning. I introduce a diagrammatic approach to computing finite-width corrections that dramatically simplifies the necessary calculations and makes neural network statistics at finite width computable.
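For orientation, the objects involved can be sketched as follows (standard NTK conventions assumed; the notation is illustrative, not taken from the talk). The empirical NTK of a network $f$ with parameters $\theta$ is

$$
\Theta_{ij}(x, x') \;=\; \sum_{\mu} \frac{\partial f_i(x)}{\partial \theta_\mu}\,\frac{\partial f_j(x')}{\partial \theta_\mu}\,.
$$

At infinite width, $\Theta$ is deterministic and constant during training. At finite width $n$, its statistics admit a perturbative expansion of the schematic form

$$
\mathbb{E}\big[\Theta_t\big] \;=\; \Theta^{(0)} \;+\; \frac{1}{n}\,\Theta^{(1)}_t \;+\; \mathcal{O}\!\left(\frac{1}{n^2}\right),
$$

where the time-dependent $1/n$ correction captures the NTK evolution and feature learning that the leading order misses; the Feynman-diagram bookkeeping organizes the terms contributing to $\Theta^{(1)}_t$ and higher orders.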

seminar
12.02.26 26.03.26

Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences
