The article presents Graph-Mamba, a model developed to improve long-range context modeling in graph networks. The attention mechanisms used in Graph Transformers scale poorly to large graphs because of their quadratic computational cost. Graph-Mamba addresses this issue by integrating a Mamba block with an input-dependent node selection mechanism. This approach enhances context-aware reasoning and improves predictive performance while reducing computational cost and memory consumption. The study reports that Graph-Mamba outperforms existing methods on long-range graph prediction benchmarks at a fraction of the compute.
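To make the idea concrete, below is a minimal, illustrative PyTorch sketch of the core recipe described above: nodes are ordered by an input-dependent priority signal (node degree is used here as a stand-in), then processed as a sequence by a simplified selective state-space recurrence whose gating depends on the input, and finally mapped back to the original node order. All class and parameter names are assumptions for illustration and are not taken from the official implementation at https://github.com/bowang-lab/Graph-Mamba.

```python
# Hypothetical sketch only; names, shapes, and the simplified recurrence are
# assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class GraphMambaSketch(nn.Module):
    """Orders nodes by a priority signal (here: degree), then runs a
    simplified input-dependent (selective) recurrence over the sequence."""

    def __init__(self, dim: int, state_dim: int = 16):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        # Input-dependent parameters, in the spirit of Mamba's selection
        # mechanism: each node modulates how much state it writes and keeps.
        self.gate = nn.Linear(dim, 1)
        self.B = nn.Linear(dim, state_dim)   # per-node state update
        self.C = nn.Linear(dim, state_dim)   # per-node readout weights
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, degree: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node features; degree: (num_nodes,) node degrees,
        # used here as the prioritization signal for node ordering.
        order = torch.argsort(degree)        # low- to high-priority nodes
        inv_order = torch.argsort(order)     # permutation to restore node order
        h = self.in_proj(x)[order]           # nodes laid out as a sequence

        gate = torch.sigmoid(self.gate(h))   # (num_nodes, 1) retention gate
        B = self.B(h)
        C = self.C(h)

        state = torch.zeros(B.size(1), device=x.device)
        outputs = []
        for t in range(h.size(0)):
            # Selective recurrence: the gate decides how much past context to
            # keep versus how much of the current node to absorb.
            state = gate[t] * state + (1 - gate[t]) * B[t]
            outputs.append((C[t] * state).sum().unsqueeze(0))
        y = torch.cat(outputs).unsqueeze(-1)  # (num_nodes, 1) context signal

        # Mix the long-range context back into node features, undo the
        # permutation, and add a residual connection.
        out = self.out_proj(h * y)
        return out[inv_order] + x


# Toy usage: 50 nodes with 64-dim features and random degrees.
x = torch.randn(50, 64)
degree = torch.randint(1, 10, (50,))
layer = GraphMambaSketch(dim=64)
out = layer(x, degree)                        # (50, 64)
```

Note that the recurrence above runs linearly in the number of nodes, which is the key contrast with the quadratic cost of full attention; the real model additionally uses hardware-aware scan implementations, but that detail is omitted from this sketch.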

Publication date: 2 Feb 2024
Project Page: https://github.com/bowang-lab/Graph-Mamba
Paper: https://arxiv.org/pdf/2402.00789