
Understanding Neural Network Expressivity via Polyhedral Geometry

  • Christoph Hertrich (UTN Nuremberg)
Live Stream

Abstract

Neural networks with rectified linear unit (ReLU) activations are one of the standard models in modern machine learning. Despite their practical importance, fundamental theoretical questions concerning ReLU networks remain open to this day. For instance, what is the precise set of (piecewise linear) functions representable by ReLU networks of a given depth? And which functions can be represented by polynomial-size neural networks? In this talk I will report on recent progress towards resolving such questions using techniques from polyhedral geometry and combinatorial optimization. This talk is based on arxiv.org/abs/2505.14338, arxiv.org/abs/2411.03006, and related papers and preprints.
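The abstract's first question concerns which piecewise linear functions ReLU networks of a given depth can represent. As a minimal illustration (hand-crafted for this page, not taken from the talk), a network with a single hidden ReLU layer already represents the piecewise linear function max(x1, x2), via the identities x = ReLU(x) - ReLU(-x) and max(a, b) = a + ReLU(b - a):

```python
def relu(z):
    # Rectified linear unit for a single scalar input
    return z if z > 0 else 0.0

def relu_net_max(x1, x2):
    """Depth-2 ReLU network computing max(x1, x2).

    Weights are a hand-crafted illustration, not from the talk:
    the hidden layer has three ReLU units, the output layer is linear.
    """
    # Hidden layer: three ReLU units
    h1 = relu(x2 - x1)
    h2 = relu(x1)
    h3 = relu(-x1)
    # Linear output: (h2 - h3) + h1 = x1 + ReLU(x2 - x1) = max(x1, x2)
    return h2 - h3 + h1
```

Whether analogous shallow constructions exist for the maximum of more inputs, or whether greater depth is provably required, is exactly the kind of expressivity question the talk addresses.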

Seminar dates: 11.12.25, 26.02.26

Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences
