Towards graph foundation models: from language modeling to graph modeling
- Dexiong Chen (Max Planck Institute of Biochemistry)
Abstract
Graph learning has witnessed rapid progress in recent years. While message passing neural networks (MPNNs) remain dominant, Graph Transformers (GTs) have emerged as promising alternatives, offering global context and scalability—traits that have driven breakthroughs in language modeling. In this talk, I will revisit a representative Graph Transformer model from 2022 and discuss why it is unlikely to be the final solution for graph representation learning. I will then introduce alternative approaches, particularly random walk–based models, that address key limitations of both MPNNs and GTs. Building on recent advances in language modeling, I will show how to capture long-range dependencies within random walks to construct more expressive models. Finally, I will present how random walks can be leveraged for graph generation, illustrating how graphs can be modeled analogously to language through reversible transformations between graphs and random sequences. This approach opens new possibilities for graph foundation models that could transform how we understand and generate complex graph-structured data.