Workshop

On Oversquashing in Message Passing Graph Neural Networks

  • Pradeep Kr. Banerjee (MPI MiS)
G3 10 (Lecture hall)

Abstract

The quality of signal propagation in message passing graph neural networks strongly influences their expressivity. In particular, for prediction tasks relying on long-range interactions, recursive aggregation of node features can lead to an undesired phenomenon called "oversquashing". We present a framework for analyzing oversquashing and propose a graph rewiring algorithm aimed at alleviating it. Our algorithm employs a random local edge flip primitive motivated by an expander graph construction. We compare the spectral expansion properties of our algorithm with those of an existing curvature-based non-local rewiring strategy. Synthetic experiments show that while our algorithm generally expands the graph more slowly, it is computationally cheaper, preserves the node degrees exactly, and never disconnects the graph.
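The abstract describes a degree-preserving random local edge flip used for rewiring. The sketch below illustrates one plausible form of such a flip in NetworkX; the function name, the rejection rules, and the use of the normalized-Laplacian spectral gap as an expansion proxy are assumptions made for illustration, not the authors' exact construction.

```python
import random
import networkx as nx

def random_local_edge_flip(G: nx.Graph, rng: random.Random) -> bool:
    """Apply one degree-preserving local flip (illustrative sketch).

    Pick a random edge {u, v}, a neighbor a of u and a neighbor b of v,
    and swap endpoints: {u, a}, {v, b} -> {u, b}, {v, a}. Degrees are
    unchanged, and because the anchor edge {u, v} is kept, every rerouted
    path can pass through it, so the graph stays connected.
    """
    u, v = rng.choice(list(G.edges()))
    a_candidates = [x for x in G.neighbors(u) if x != v]
    b_candidates = [x for x in G.neighbors(v) if x != u]
    if not a_candidates or not b_candidates:
        return False
    a = rng.choice(a_candidates)
    b = rng.choice(b_candidates)
    # Reject flips that would create a self-loop or a duplicate edge.
    if a == b or G.has_edge(u, b) or G.has_edge(v, a):
        return False
    G.remove_edges_from([(u, a), (v, b)])
    G.add_edges_from([(u, b), (v, a)])
    return True

if __name__ == "__main__":
    # Apply flips to a random regular graph and track a crude expansion
    # proxy: the spectral gap of the normalized Laplacian.
    rng = random.Random(0)
    G = nx.random_regular_graph(4, 100, seed=0)
    for _ in range(500):
        random_local_edge_flip(G, rng)
    gap = sorted(nx.normalized_laplacian_spectrum(G))[1]
    print(f"connected: {nx.is_connected(G)}, spectral gap: {gap:.4f}")
```

In this toy run the node degrees are preserved by construction and the graph remains connected; a growing spectral gap over many flips would indicate improving expansion, which is the kind of quantity the abstract's comparison with curvature-based rewiring refers to.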

Katharina Matschke, Max Planck Institute for Mathematics in the Sciences (contact)
Guido Montúfar, Max Planck Institute for Mathematics in the Sciences
Pradeep Kr. Banerjee, Max Planck Institute for Mathematics in the Sciences
Kedar Karhadkar, Max Planck Institute for Mathematics in the Sciences