
Activation degree thresholds and expressiveness of polynomial neural networks

  • Bella Lynne Finkel (University of Wisconsin–Madison)
Live Stream

Abstract

Polynomial neural networks are implemented in a range of applications and present an advantageous framework for theoretical machine learning. In this talk, we will discuss the expressive power of deep polynomial neural networks through the geometry of their neurovariety. In particular, we introduce the notion of the activation degree threshold of a network architecture to determine when the dimension of a neurovariety achieves its theoretical maximum. We show that activation degree thresholds of polynomial neural networks exist and provide an upper bound, resolving a conjecture of Kileel, Trager, and Bruna on the dimension of neurovarieties associated to networks with high activation degree. Certain structured architectures have exceptional activation degree thresholds, making them especially expressive in the sense of their neurovariety dimension. In this direction, we discuss a proof that polynomial neural networks with equi-width architectures are maximally expressive. Along the way, we will see several illustrative examples motivated by applications.
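The object studied in the talk can be illustrated concretely. A minimal sketch, under the standard setup in this literature (not taken from the talk itself): a polynomial neural network alternates linear layers with the coordinate-wise power activation σ(t) = t^r, so each output coordinate is a homogeneous polynomial of degree r^(L−1) in the input; the neurovariety is the closure of the set of such polynomial maps as the weights vary. The function and variable names below are illustrative.

```python
import numpy as np

def poly_net(weight_mats, r, x):
    """Evaluate a polynomial neural network: linear layers alternating
    with the coordinate-wise power activation sigma(t) = t**r.
    The final layer is linear with no activation."""
    h = x
    for W in weight_mats[:-1]:
        h = (W @ h) ** r  # linear map, then elementwise degree-r activation
    return weight_mats[-1] @ h

# Architecture (d0, d1, d2) = (2, 3, 1) with activation degree r = 2.
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
x = np.array([1.0, 2.0])

# Each output coordinate is a homogeneous polynomial of degree
# r**(L-1) = 2 in x, so scaling x by c scales the output by c**2.
y = poly_net(Ws, 2, x)
```

Varying the entries of `Ws` sweeps out the set of degree-2 homogeneous polynomials this architecture can represent; the dimension of (the closure of) that set is the neurovariety dimension whose maximality the activation degree threshold characterizes.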

seminar
20.11.25, 11.12.25

Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences

Upcoming Events of this Seminar