Workshop

Geometry of Higher-Order Structures in Machine Learning

  • Jiayi Li
Lecture Hall, Laboratoire de Mathématiques d’Orsay, Université Paris-Saclay (Paris)

Abstract

Neural networks defined by polynomial or rational activations, as well as architectures with higher-order connectivity, give rise to algebraic varieties whose geometry encodes both the optimization landscape and the generalization behavior of learned representations. Using tools from numerical algebraic geometry, including ideals, algebraic stratifications, and resolutions of singularities, we study the critical loci and degeneracies that shape learning dynamics. This approach reveals how phenomena such as symmetry, collapse, and instability correspond to singular structures within parameter space. The results provide a language for describing the hidden geometry of modern learning systems.
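To illustrate the first claim of the abstract in the simplest case, the following sketch (my own toy example, not taken from the talk) expands a one-hidden-layer network with squared activation, f(x) = c1(w1 x + b1)^2 + c2(w2 x + b2)^2. Each coefficient of the resulting polynomial in x is itself a polynomial in the network parameters, so the set of functions the architecture can realize is the image of a polynomial map, i.e. (the closure of) an algebraic variety.

```python
# Toy illustration: a width-2, one-hidden-layer network with the
# polynomial activation sigma(t) = t**2. Its input-output map is a
# degree-2 polynomial in x whose coefficients are polynomials in the
# parameters (w1, w2, b1, b2, c1, c2).
import sympy as sp

x = sp.symbols("x")
w1, w2, b1, b2, c1, c2 = sp.symbols("w1 w2 b1 b2 c1 c2")

# f(x) = c1 * sigma(w1*x + b1) + c2 * sigma(w2*x + b2), sigma(t) = t**2
f = c1 * (w1 * x + b1) ** 2 + c2 * (w2 * x + b2) ** 2

# Coefficients of f as a polynomial in x, highest degree first.
coeffs = sp.Poly(sp.expand(f), x).all_coeffs()

# Each coefficient is a polynomial in the parameters; the coefficient
# map (w, b, c) -> (a2, a1, a0) parametrizes the variety of realizable
# functions.
for deg, a in zip([2, 1, 0], coeffs):
    print(f"coefficient of x^{deg}:", sp.expand(a))
```

Running this prints the parametrization a2 = c1 w1² + c2 w2², a1 = 2 c1 b1 w1 + 2 c2 b2 w2, a0 = c1 b1² + c2 b2²; eliminating the parameters from such maps is exactly where the ideal-theoretic tools mentioned in the abstract enter.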