Subgraph Pooling: Tackling Negative Transfer on Graphs
Zehong Wang, Zheyuan Zhang, Chuxu Zhang, Yanfang Ye
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 5153-5161.
https://doi.org/10.24963/ijcai.2024/570
Transfer learning aims to enhance performance on a target task by using knowledge from related tasks. However, when the source and target tasks are not closely aligned, transfer can reduce performance, a phenomenon known as negative transfer. Unlike in image or text data, we find that negative transfer can commonly occur in graph-structured data, even when source and target graphs are semantically similar. Specifically, we identify that structural differences significantly amplify the dissimilarities in node embeddings across graphs. To mitigate this, we present a new insight in this paper: for semantically similar graphs, although structural differences lead to a significant distribution shift in node embeddings, their impact on subgraph embeddings could be marginal. Building on this insight, we introduce Subgraph Pooling (SP), which aggregates nodes sampled from a k-hop neighborhood, and Subgraph Pooling++ (SP++), which aggregates nodes sampled via random walk, to mitigate the impact of graph structural differences on knowledge transfer. We theoretically analyze the role of SP in reducing graph discrepancy and conduct extensive experiments to evaluate its superiority under various settings. The proposed SP methods are effective yet elegant and can be easily applied on top of any backbone Graph Neural Network (GNN). Our code and data are available at: https://github.com/Zehong-Wang/Subgraph-Pooling.
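To make the core idea concrete, below is a minimal sketch of k-hop subgraph pooling: given node embeddings produced by any backbone GNN, each node's embedding is replaced by the mean of the embeddings within its k-hop neighborhood. The function name, the dense-adjacency construction, and the mean-pooling choice are illustrative assumptions, not the authors' reference implementation (see the linked repository for that).

```python
import torch

def khop_subgraph_pooling(H: torch.Tensor,
                          edge_index: torch.Tensor,
                          k: int = 2) -> torch.Tensor:
    """Mean-pool node embeddings over each node's k-hop neighborhood.

    H:          [num_nodes, dim] node embeddings from any backbone GNN.
    edge_index: [2, num_edges] COO edge list of the graph.
    k:          neighborhood radius (number of hops).
    """
    n = H.size(0)
    # Dense boolean adjacency with self-loops; fine for a sketch,
    # though a sparse implementation would be needed at scale.
    adj = torch.zeros(n, n, dtype=torch.bool, device=H.device)
    adj[edge_index[0], edge_index[1]] = True
    adj |= torch.eye(n, dtype=torch.bool, device=H.device)

    # Nodes reachable within k hops, via repeated boolean matrix products.
    reach = adj.clone()
    for _ in range(k - 1):
        reach = (reach.float() @ adj.float()) > 0

    # Average the embeddings inside each node's k-hop subgraph.
    weights = reach.float()
    weights = weights / weights.sum(dim=1, keepdim=True)
    return weights @ H  # [num_nodes, dim] subgraph embeddings
```

In use, the pooled embeddings would feed a standard prediction head (e.g., a linear classifier) in place of the raw node embeddings; the SP++ variant described in the abstract swaps the k-hop neighborhood for nodes visited by a random walk, which this sketch does not implement.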
Keywords:
Machine Learning: ML: Sequence and graph learning
Data Mining: DM: Mining graphs
Machine Learning: ML: Multi-task and transfer learning
Machine Learning: ML: Semi-supervised learning