Algebraic Complexity and Neurovariety of Linear Convolutional Networks

  • Vahid Shahverdi (KTH)
Live Stream

Abstract

Linear networks are artificial neural networks with linear activation functions. Despite their simplicity, they serve as a tractable model for understanding more complex architectures. In this talk, we focus on linear convolutional networks with arbitrary strides. The neuromanifold of such a network is a semialgebraic set, which can be identified with a space of polynomials admitting specific factorizations. We introduce a recursive algorithm that derives polynomial equations whose common zeros cut out the Zariski closure of the neuromanifold. Additionally, we examine the algebraic complexity of training these networks using techniques from metric algebraic geometry. We show that the total number of complex critical points arising in the optimization of these networks equals the generic Euclidean distance degree of a Segre variety. This number is notably larger than the number of critical points encountered when training a fully connected linear network with the same number of parameters.
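
To make the polynomial picture concrete, here is a minimal numerical sketch (not from the talk; it assumes stride 1 and "full" convolutions, and the filter names w1, w2 are illustrative) of the basic fact behind the neuromanifold description: composing convolutional layers multiplies the associated filter polynomials, so the end-to-end filters of a stride-1 linear convolutional network are exactly the polynomials that factor into polynomials of the layers' degrees.

```python
import numpy as np

# Sketch: for stride-1 linear convolutional layers, applying two filters in
# sequence equals applying the single filter w1 * w2, i.e. multiplying the
# polynomials whose coefficients are the filter entries.

rng = np.random.default_rng(0)
w1 = rng.standard_normal(3)   # layer-1 filter  <->  degree-2 polynomial
w2 = rng.standard_normal(4)   # layer-2 filter  <->  degree-3 polynomial
x = rng.standard_normal(10)   # input signal

# Apply the two layers one after the other (full convolution, no padding).
layer_by_layer = np.convolve(np.convolve(x, w1), w2)

# Apply the single end-to-end filter: the coefficient-wise polynomial product.
end_to_end = np.convolve(x, np.convolve(w1, w2))

assert np.allclose(layer_by_layer, end_to_end)
print("end-to-end filter:", np.convolve(w1, w2))
```

For larger strides the composition rule is no longer plain polynomial multiplication, which is one source of the more intricate semialgebraic structure the abstract refers to.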

Seminar: Math Machine Learning Seminar MPI MiS + UCLA
Dates: 01.08.24, 22.08.24
Venue: MPI for Mathematics in the Sciences, Live Stream

Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via mail)
