Complexity of Deciding Injectivity and Surjectivity of ReLU Neural Networks
- Moritz Grillo (TU Berlin)
Abstract
Neural networks with ReLU activation play a key role in modern machine learning. In view of safety-critical applications, the verification of trained networks is of great importance and requires a thorough understanding of essential properties of the function computed by a ReLU network, such as injectivity and surjectivity.
In this talk, I will present computational complexity results concerning the injectivity and surjectivity of ReLU networks. We show that deciding the injectivity of a ReLU layer is coNP-complete. On the positive side, we introduce a parameterized algorithm that establishes fixed-parameter tractability of the problem with respect to the input dimension. Additionally, we characterize surjectivity for two-layer ReLU networks with one-dimensional output and prove that the decision problem is NP-complete. Finally, we reveal a connection to computational convexity by reformulating the surjectivity problem as a zonotope containment problem.
This is joint work with Vincent Froese and Martin Skutella.
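To make the two decision problems concrete, here is a minimal sketch in plain Python of the kind of witnesses involved. The weights below are illustrative toy choices, not taken from the talk: a single ReLU layer x ↦ max(0, Wx + b) collapses inputs (a non-injectivity witness), and a two-layer network with one-dimensional output can miss negative values entirely (a non-surjectivity witness).

```python
# Toy witnesses of non-injectivity and non-surjectivity for ReLU networks.
# All weights are illustrative assumptions, not from the presented results.

def relu_layer(W, b, x):
    """One ReLU layer: x -> max(0, Wx + b), applied componentwise."""
    return [max(0.0, sum(w * xj for w, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# Non-injectivity: with W = [[1]], b = [0], every input x <= 0 maps to 0,
# so distinct inputs share the same output.
W, b = [[1.0]], [0.0]
assert relu_layer(W, b, [-1.0]) == relu_layer(W, b, [-2.0]) == [0.0]

def two_layer(x):
    """A two-layer ReLU network R -> R: relu(relu(x)) with identity weights."""
    hidden = relu_layer([[1.0]], [0.0], [x])
    return relu_layer([[1.0]], [0.0], hidden)[0]

# Non-surjectivity: the output is always >= 0, so the network is not onto R.
assert two_layer(-3.0) == 0.0
assert two_layer(2.0) == 2.0
```

Deciding these properties in general is what the talk shows to be coNP-complete (injectivity of a layer) and NP-complete (surjectivity of two-layer networks), so such witnesses are easy to verify but, in general, hard to find.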