Topological obstruction to the training of shallow ReLU neural networks

  • Francesco Vaccarino (Politecnico di Torino)

Abstract

Studying the interplay between the geometry of the loss landscape and the optimisation trajectories of simple neural networks is a fundamental step for understanding their behaviour in more complex settings. We discuss the presence of a topological obstruction in the loss landscape of shallow ReLU neural networks trained using gradient flow. The homogeneous nature of the ReLU activation function constrains the training trajectories to lie on a product of quadric hypersurfaces whose shape depends on the particular initialisation of the network's parameters. We prove that these quadrics can have multiple connected components, limiting the set of reachable parameters during training. We analytically compute the number of these components and discuss the possibility of mapping one component onto another through neuron rescaling and permutation.
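For context, the quadric constraint mentioned in the abstract is consistent with the standard balancedness computation for 1-homogeneous activations. The sketch below uses our own notation, not the talk's: a scalar-output network f(x) = Σ_j a_j ReLU(⟨w_j, x⟩) trained by gradient flow on a generic differentiable loss L.

```latex
% Sketch (our notation, not taken from the talk): balancedness invariant for a
% shallow ReLU network f(x) = \sum_{j=1}^m a_j\,\sigma(\langle w_j, x\rangle),
% with \sigma(z) = \max(z,0), trained by gradient flow on a loss L(\theta).
\[
\frac{d}{dt}\Big(\|w_j\|^2 - a_j^2\Big)
  = 2\langle w_j, \dot w_j\rangle - 2\,a_j \dot a_j
  = -2\Big(\Big\langle w_j, \tfrac{\partial L}{\partial w_j}\Big\rangle
           - a_j\,\tfrac{\partial L}{\partial a_j}\Big) = 0,
\]
% which vanishes because \sigma is 1-homogeneous (\sigma'(z)\,z = \sigma(z)),
% so that \langle w_j, \partial L/\partial w_j\rangle = a_j\,\partial L/\partial a_j.
% Each pair (w_j, a_j) therefore remains on the quadric fixed by initialisation,
\[
Q_{c_j} = \{(w,a) : \|w\|^2 - a^2 = c_j\},
\qquad c_j = \|w_j(0)\|^2 - a_j(0)^2,
\]
% and the full trajectory lies on the product Q_{c_1}\times\dots\times Q_{c_m}.
% For instance, when c_j < 0 the quadric Q_{c_j} splits into the two connected
% components a > 0 and a < 0, so the sign of the outer weight a_j cannot change
% during training.
```

This illustrates how the count of connected components, and hence the reachable set, is determined entirely by the signs of the initialisation-dependent constants c_j.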

Seminar
13.02.2025

Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences Live Stream
