The manifold of low energy states in neural networks: structure and barrier to algorithms

  • Enrico Malatesta (Bocconi University)

Abstract

The characterization of the structure of the manifold of low-lying energy states in neural networks is among the most fundamental theoretical questions in machine learning. In recent years, many empirical studies on the landscapes of neural networks and constraint satisfaction problems (CSPs) have shown that the low-lying configurations are often found in complex connected structures, in which zero-energy paths between pairs of distant solutions can be constructed.
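
For concreteness, the setting behind these statements can be sketched as follows (the notation here is a standard convention, not taken from the talk itself): in a perceptron-type CSP with $\alpha N$ random patterns $\xi^\mu \in \mathbb{R}^N$, the energy of a configuration $w$ counts the violated margin constraints,

  E(w) = \sum_{\mu=1}^{\alpha N} \Theta\left( \kappa - \frac{w \cdot \xi^\mu}{\sqrt{N}} \right),

where $\Theta$ is the Heaviside step function and $\kappa$ is the margin. Solutions are the zero-energy configurations, and a zero-energy path is a path between two solutions along which $E$ remains zero.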

In this talk, I will discuss the geometrical organization of solutions in two prototypical problems, the "binary" and the "negative" perceptron. I will show that wide flat minima arise as complex extensive structures from the coalescence of minima around "high-margin" (i.e. locally robust) configurations. In the binary perceptron, moreover, the constraint density at which a hard phase for algorithms emerges can be predicted from the disappearance of the widest and flattest minima.
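
In the sketch notation above (again an assumption about standard conventions, not the speaker's definitions), the two models differ in the weight space and in the sign of the margin:

  binary perceptron:    w \in \{-1, +1\}^N,  \kappa \ge 0,
  negative perceptron:  w \in \mathbb{R}^N,  \|w\|_2^2 = N,  \kappa < 0,

so that a "high-margin" configuration is one satisfying all constraints at some larger margin $\kappa' > \kappa$, and is therefore robust to local perturbations of $w$.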

Finally, I will introduce a novel analytical method for characterizing the typical energy barriers between two configurations sampled from the zero-temperature Gibbs measure of the problem. This approach provides a detailed characterization of the shape of the solution space of the negative perceptron and of one-hidden-layer neural networks in the overparameterized regime.
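
In the usual convention (a hedged reconstruction of standard definitions, not a formula from the abstract), the zero-temperature Gibbs measure is the flat measure over solutions, $P(w) \propto \mathbb{1}[E(w) = 0]$, and the barrier between two solutions $w_1, w_2$ drawn independently from $P$ is the min-max energy over paths connecting them:

  \Delta E(w_1, w_2) = \min_{\gamma: w_1 \to w_2} \max_{t \in [0,1]} E(\gamma(t)).

A vanishing typical barrier signals a connected structure of solutions, while an extensive barrier signals clustering.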


Math Machine Learning seminar MPI MIS + UCLA
