Talk

Analysis of the Geometric Structure of Neural Networks and Neural ODEs via Morse Functions

  • Sara-Viola Kuntz (Technical University of Munich)

Abstract

Besides classical feed-forward neural networks, neural ordinary differential equations (neural ODEs) have gained particular interest in recent years. Neural ODEs can be interpreted as an infinite-depth limit of feed-forward or residual neural networks. In this presentation, we study the input-output dynamics of finite- and infinite-depth neural networks with scalar output. In the finite-depth case, the input is mapped under multiple non-linear transformations to the state of a single output node. Analogously, a neural ODE applies a linear transformation to the input and then a linear transformation to its time-T map. We show that, depending on the structure of the network, the input-output map has different properties regarding the existence and regularity of critical points, which can be characterized via Morse functions. We prove that critical points cannot exist if the dimensions of the hidden layers are monotonically decreasing or if the dimension of the phase space is smaller than or equal to the input dimension. If critical points exist, we classify their regularity depending on the specific architecture. The established theorems are comparable in the finite- and infinite-depth cases and allow us to formulate results on universal embedding and approximation. Our dynamical systems viewpoint on the geometric structure of the input-output map provides a fundamental understanding of why specific architectures outperform others.
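
To fix ideas, here is a minimal sketch of the neural ODE input-output map described above; the symbols $W_{\mathrm{in}}$, $W_{\mathrm{out}}$, $f_\theta$, and $\phi_T$ are illustrative choices, not necessarily the notation used in the talk. For an input $x \in \mathbb{R}^n$ and a phase space $\mathbb{R}^m$, consider
$$\dot{h}(t) = f_\theta(h(t)), \qquad h(0) = W_{\mathrm{in}} x, \qquad \Phi(x) = W_{\mathrm{out}}\,\phi_T(W_{\mathrm{in}} x),$$
where $\phi_T$ is the time-$T$ flow map, $W_{\mathrm{in}} \in \mathbb{R}^{m \times n}$ and $W_{\mathrm{out}} \in \mathbb{R}^{1 \times m}$. By the chain rule,
$$\nabla \Phi(x) = W_{\mathrm{in}}^\top\, D\phi_T(W_{\mathrm{in}} x)^\top\, W_{\mathrm{out}}^\top.$$
Since the time-$T$ flow of an ODE is a diffeomorphism, $D\phi_T$ is invertible; so assuming $W_{\mathrm{out}} \neq 0$ and $W_{\mathrm{in}}$ of full rank $m \le n$, the gradient never vanishes, which illustrates why critical points cannot occur when the phase-space dimension does not exceed the input dimension.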

Seminar: Math Machine Learning seminar MPI MIS + UCLA
Dates: 23.01.25, 30.01.25
Venue: MPI for Mathematics in the Sciences (Live Stream)
Contact: Katharina Matschke, MPI for Mathematics in the Sciences
