Algebraic Geometry of Rational Neural Networks
- Maksym Zubkov (Berkeley)
Abstract
Rational neural networks are feedforward neural networks with rational activation functions. They arise naturally in applications such as solving partial differential equations and regression/generative modeling, thanks to their smoothness and enhanced approximation power. In this talk, I will focus on the simplest case, with activation σ(x) = 1/x, and explore the expressivity of such architectures: which families of functions they can represent. We will define the neuromanifold (the functional space) and the neurovariety (the Zariski closure of the neuromanifold), and determine all architectures, for shallow and binary deep networks, whose neurovariety fills the entire ambient space. Finally, I will present a general closed formula for any rational neural network, along with computational results showing how such networks yield interpretable weights that indicate the locations of the linear poles of meromorphic functions.
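As a minimal illustrative sketch (the function names and weights below are hypothetical, not taken from the talk), a shallow network with activation σ(x) = 1/x is a sum of reciprocals of affine forms, so each hidden unit's weights directly encode one linear pole:

```python
import numpy as np

def sigma(x):
    # reciprocal activation sigma(x) = 1/x
    return 1.0 / x

def shallow_rational_net(x, A, b, c):
    """Shallow rational network f(x) = c . sigma(A x + b).
    Each hidden unit contributes c_i / (a_i . x + b_i), so the row
    (a_i, b_i) marks a linear pole along the hyperplane a_i . x + b_i = 0."""
    return c @ sigma(A @ x + b)

# Hypothetical example: two hidden units, one input variable.
A = np.array([[1.0], [1.0]])
b = np.array([-1.0, 1.0])    # poles at x = 1 and x = -1
c = np.array([0.5, -0.5])

x = np.array([2.0])
# f(x) = 0.5/(x-1) - 0.5/(x+1) = 1/(x^2 - 1), so f(2) = 1/3
print(shallow_rational_net(x, A, b, c))
```

Reading the trained weights (A, b) then recovers where the represented meromorphic function has its linear poles, in the spirit of the interpretability results mentioned above.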