Talk

Complexity of Deciding Injectivity and Surjectivity of ReLU Neural Networks

  • Moritz Grillo (TU Berlin)
Live Stream

Abstract

Neural networks with ReLU activation play a key role in modern machine learning. In view of safety-critical applications, the verification of trained networks is of great importance and requires a thorough understanding of essential properties of the function computed by a ReLU network, including characteristics such as injectivity and surjectivity.

In this talk, I will present computational complexity results concerning the injectivity and surjectivity of ReLU networks. We show that deciding the injectivity of a ReLU layer is coNP-complete. On the positive side, we introduce a parameterized algorithm that establishes fixed-parameter tractability of the problem with respect to the input dimension. Additionally, we characterize surjectivity for two-layer ReLU networks with one-dimensional output and prove that the decision problem is NP-complete. Finally, we reveal a connection to computational convexity by reformulating the surjectivity problem as a zonotope containment problem.
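To make the injectivity question concrete, here is a minimal toy illustration (not part of the talk; the weights below are illustrative, assuming the standard definition of a ReLU layer x ↦ max(0, Wx + b) applied componentwise). A single ReLU neuron collapses all inputs with Wx + b ≤ 0 to the same output, so it is not injective; pairing it with the mirrored neuron ReLU(−x) restores injectivity on the real line.

```python
import numpy as np

def relu_layer(W, b, x):
    # ReLU layer: x -> max(0, Wx + b), applied componentwise
    return np.maximum(0.0, W @ x + b)

# A single ReLU neuron is not injective: every input with Wx + b <= 0
# is mapped to the zero output.
W1, b1 = np.array([[1.0]]), np.array([0.0])
y_a = relu_layer(W1, b1, np.array([-1.0]))
y_b = relu_layer(W1, b1, np.array([-2.0]))
assert np.allclose(y_a, y_b)  # distinct inputs, identical output

# Adding the mirrored neuron ReLU(-x) makes the layer injective on R:
# from (max(0, x), max(0, -x)) the input is recovered as the difference.
W2, b2 = np.array([[1.0], [-1.0]]), np.array([0.0, 0.0])
for x in np.linspace(-3.0, 3.0, 13):
    y = relu_layer(W2, b2, np.array([x]))
    assert np.isclose(y[0] - y[1], x)  # inverse map: x = y_1 - y_2
```

This brute-force reasoning only certifies a hand-picked example; the point of the coNP-completeness result is that deciding injectivity for general ReLU layers is presumably much harder.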

This is joint work with Vincent Froese and Martin Skutella.

Seminar
12.06.25 / 02.10.25

Math Machine Learning seminar MPI MIS + UCLA

MPI for Mathematics in the Sciences
