Talk

The Geometry of Neural Nets' Parameter Spaces Under Reparametrization

  • Agustinus Kristiadi (Vector Institute)

Abstract

Model reparametrization, which follows the change-of-variable rule of calculus (not to be confused with weight-space symmetry), is a popular way to improve the training of neural nets, e.g. in WeightNorm. But it can also be problematic, since it can induce inconsistencies in, e.g., Hessian-based flatness measures, optimization trajectories, and modes of probability densities. This complicates downstream analyses: for instance, one cannot definitively relate flatness to generalization, since an arbitrary reparametrization changes their relationship. In this talk, I will present a study of the invariance of neural nets under reparametrization from the perspective of Riemannian geometry. From this point of view, invariance is an inherent property of any neural net, provided one explicitly represents the metric and applies the correct associated transformation rules. This matters because, although the metric is always present, it is often implicitly assumed to be the identity, and thus dropped from the notation and then lost under reparametrization. I will discuss the implications for measuring the flatness of minima, for optimization, and for probability-density maximization. As a bonus, I will also give a teaser of our other recent work, which exploits the geometry of preconditioning matrices to develop an inverse-free, structured, KFAC-like second-order optimization method for very large, modern neural nets such as transformers. The resulting method is numerically stable in low precision and memory efficient.
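
To make the flatness point above concrete, here is a minimal numerical sketch (not part of the talk; the quadratic loss, the linear reparametrization, and all names below are illustrative assumptions): under a change of variables theta = J psi, the plain Hessian trace at a minimum changes, while the trace taken with respect to the pulled-back metric G = J^T J does not.

    import numpy as np

    # Toy setup (illustrative only): a quadratic loss L(theta) = 0.5 * theta^T A theta
    # with minimum at theta* = 0 and Hessian A there.
    rng = np.random.default_rng(0)
    d = 5
    A = rng.normal(size=(d, d))
    A = A @ A.T + np.eye(d)                      # symmetric positive definite Hessian

    # Invertible linear reparametrization theta = J @ psi (e.g. a per-coordinate rescaling).
    J = np.diag(rng.uniform(0.1, 10.0, size=d))

    H_theta = A                                  # Hessian at the minimum, theta-coordinates
    H_psi = J.T @ A @ J                          # Hessian at the same minimum, psi-coordinates
    G_psi = J.T @ J                              # pullback of the identity metric on theta

    print(np.trace(H_theta))                     # "naive" flatness measure in theta
    print(np.trace(H_psi))                       # changes under reparametrization
    print(np.trace(np.linalg.solve(G_psi, H_psi)))  # metric-aware measure: equals trace(H_theta)

Carrying the metric through the change of variables is what restores the invariance that the naive trace loses; this is the sense in which the metric is always present even when it is notationally dropped.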

seminar
19.12.24

Math Machine Learning seminar MPI MiS + UCLA

MPI for Mathematics in the Sciences Live Stream

Katharina Matschke

MPI for Mathematics in the Sciences (contact via mail)
