Talk

Maxout polytopes

  • Shelby Cox (MPI MiS, Leipzig)

Abstract

Maxout polytopes are defined by feedforward neural networks with 2-maxout activation and non-negative weights after the first layer. Fixing the number of nodes in each layer and varying the network weights yields a family of maxout polytopes. I will discuss the parameter spaces and extremal f-vectors for these families of maxout polytopes. I will also show that when the network has no bottlenecks, the generic maxout polytopes are cubical. A key construction is the separating hypersurface of two normally equivalent polytopes, which arises when a layer is added to the network.
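The construction in the abstract can be sketched numerically. The following is my own minimal illustration, not the paper's code: a feedforward network whose units apply a 2-maxout activation (the maximum of two affine functions of the inputs), with weights constrained to be non-negative after the first layer. Since the maximum of convex functions and non-negative combinations of convex functions are again convex, such a network computes a convex piecewise-linear function; the gradients of its affine pieces span a polytope, which is a hedged reading of how a maxout polytope arises. All function and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def maxout_layer(x, W1, b1, W2, b2):
    """One 2-maxout layer: coordinatewise max of two affine maps."""
    return np.maximum(W1 @ x + b1, W2 @ x + b2)

def network(x, layers):
    """Compose 2-maxout layers; returns a vector (scalar for width-1 output)."""
    for (W1, b1, W2, b2) in layers:
        x = maxout_layer(x, W1, b1, W2, b2)
    return x

def random_layers(widths):
    """Random weights; non-negative after the first layer, as in the abstract."""
    layers = []
    for i, (m, n) in enumerate(zip(widths[1:], widths[:-1])):
        W1 = rng.standard_normal((m, n))
        W2 = rng.standard_normal((m, n))
        if i > 0:  # enforce non-negative weights after the first layer
            W1, W2 = np.abs(W1), np.abs(W2)
        b1, b2 = rng.standard_normal(m), rng.standard_normal(m)
        layers.append((W1, b1, W2, b2))
    return layers

# Network with layer widths 2 -> 3 -> 1.
layers = random_layers([2, 3, 1])
f = lambda x: network(np.asarray(x, dtype=float), layers)[0]

# Numerical sanity check: the computed function is convex
# (midpoint inequality on random pairs of points).
for _ in range(100):
    a, b = rng.standard_normal(2), rng.standard_normal(2)
    assert f((a + b) / 2) <= (f(a) + f(b)) / 2 + 1e-9
```

Varying the weights while fixing the widths, as the abstract describes, traces out the corresponding family of convex piecewise-linear functions and hence of polytopes.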

This talk is based on the preprint "Maxout polytopes", which is joint work with Andrei Balakin, Georg Loho and Bernd Sturmfels.

Seminar
11.12.25, 26.02.26

Math Machine Learning seminar MPI MiS + UCLA

MPI for Mathematics in the Sciences (Live Stream)
