

Preprint 24/2022
Oversquashing in GNNs through the lens of information contraction and graph expansion
Pradeep Kumar Banerjee, Kedar Karhadkar, Yu Guang Wang, Uri Alon, and Guido Montúfar
Submission date: 09. Aug. 2022
Keywords and phrases: graph neural networks, oversquashing, expander graphs, information bottleneck, discrete curvature, strong data processing inequality (SDPI)
Abstract:
The quality of signal propagation in message-passing graph neural networks (GNNs) strongly influences their expressivity, as has been observed in recent works. In particular, for prediction tasks relying on long-range interactions, the recursive aggregation of node features can lead to an undesired phenomenon called ``oversquashing''. We present a framework for analyzing oversquashing based on information contraction. Our analysis is guided by a model of reliable computation due to von Neumann that lends a new insight into oversquashing as signal quenching in noisy computation graphs. Building on this, we propose a graph rewiring algorithm aimed at alleviating oversquashing. Our algorithm employs a random local edge flip primitive motivated by an expander graph construction. We compare the spectral expansion properties of our algorithm with those of an existing curvature-based non-local rewiring strategy. Synthetic experiments show that, while our algorithm in general has a slower rate of expansion, it is computationally cheaper overall, preserves the node degrees exactly, and never disconnects the graph.
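To illustrate the kind of primitive the abstract refers to, the following is a minimal sketch of a random local edge flip, not the authors' exact algorithm: given an edge (u, v), one neighbor of u and one neighbor of v are swapped, so that every node keeps its degree and, because the edge (u, v) itself is untouched, the graph stays connected. The adjacency-set representation and function name are illustrative choices.

```python
import random

def local_edge_flip(adj, rng):
    """One random local edge flip on an undirected graph.

    adj: dict mapping each node to a set of its neighbors.
    Returns True if a flip was performed, False if no valid flip
    existed for the sampled edge. Degrees are preserved exactly,
    and the graph remains connected because the anchor edge (u, v)
    is never removed: a path through (u, i) reroutes as u-v-i.
    """
    # Sample a uniformly random edge (u, v).
    edges = [(u, v) for u in adj for v in adj[u] if u < v]
    u, v = rng.choice(edges)
    # Candidate i: neighbor of u, not v, not already adjacent to v.
    cand_i = [i for i in adj[u] if i != v and i not in adj[v]]
    # Candidate j: neighbor of v, not u, not already adjacent to u.
    cand_j = [j for j in adj[v] if j != u and j not in adj[u]]
    if not cand_i or not cand_j:
        return False
    i = rng.choice(cand_i)
    j = rng.choice(cand_j)
    if i == j:
        return False
    # Flip: replace edges (u, i) and (v, j) by (u, j) and (v, i).
    adj[u].remove(i); adj[i].remove(u)
    adj[v].remove(j); adj[j].remove(v)
    adj[u].add(j); adj[j].add(u)
    adj[v].add(i); adj[i].add(v)
    return True
```

Repeated flips of this kind randomize the edge set while keeping the degree sequence fixed, which is the mechanism behind the degree-preserving, connectivity-preserving rewiring described above.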