

Poster

Causal Subgraphs and Information Bottlenecks: Redefining OOD Robustness in Graph Neural Networks

Weizhi An · Wenliang Zhong · Feng Jiang · Hehuan Ma · Junzhou Huang

Strong Double Blind: This paper was not made available on public preprint services during the review process.
Fri 4 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Graph Neural Networks (GNNs) are increasingly popular for processing graph-structured data, yet they face significant challenges when training and testing distributions diverge, a common situation in real-world scenarios. This divergence often leads to substantial performance drops in GNN models. To address this, our study introduces a novel approach that effectively enhances GNN performance in Out-of-Distribution (OOD) scenarios. We propose CSIB, a method guided by causal modeling principles that generates causal subgraphs while concurrently considering both Fully Informative Invariant Features (FIIF) and Partially Informative Invariant Features (PIIF) situations. Our approach uniquely combines the principles of invariant risk minimization and the graph information bottleneck. This integration not only guides the generation of causal subgraphs but also underscores the necessity of balancing invariance principles with information compression in the face of various distribution shifts. We validate our model through extensive experiments across diverse shift types, demonstrating its effectiveness in maintaining robust performance under OOD conditions.
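To make the combination concrete, the sketch below illustrates one common way such an objective can be assembled: an average per-environment risk, an IRMv1-style invariance penalty (squared gradient of the risk with respect to a scalar classifier scale), and a compression term standing in for the information-bottleneck constraint on the causal subgraph. This is an illustrative assumption, not the authors' exact CSIB loss; the subgraph-mask proxy for the mutual-information term and all function names are hypothetical.

```python
import numpy as np

def irm_penalty(logits, y, eps=1e-4):
    """IRMv1-style penalty: squared gradient of the logistic risk with
    respect to a scalar classifier scale w (at w = 1), estimated here by
    central finite differences instead of autograd."""
    def risk(w):
        z = w * logits
        # binary logistic loss with labels y in {-1, +1}
        return np.mean(np.log1p(np.exp(-y * z)))
    g = (risk(1.0 + eps) - risk(1.0 - eps)) / (2.0 * eps)
    return g ** 2

def combined_objective(env_logits, env_labels, subgraph_mask,
                       lam=1.0, beta=0.1):
    """Illustrative IRM + information-bottleneck objective (a sketch,
    not the paper's exact formulation).

    env_logits / env_labels: one array per training environment.
    subgraph_mask: binary edge/node selection mask; its mean size is used
    as a crude proxy for the compression term I(G; G_sub)."""
    risks = [np.mean(np.log1p(np.exp(-y * z)))
             for z, y in zip(env_logits, env_labels)]
    penalties = [irm_penalty(z, y)
                 for z, y in zip(env_logits, env_labels)]
    compression = np.mean(subgraph_mask)
    return np.mean(risks) + lam * np.mean(penalties) + beta * compression
```

A larger `lam` pushes the subgraph selector toward features whose optimal classifier is shared across environments, while `beta` trades predictive risk against how much of the input graph is retained.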
