All in One: Multi-task Prompting for Graph Neural Networks (Extended Abstract)

Xiangguo Sun, Hong Cheng, Jia Li, Bo Liu, Jihong Guan

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence,
Sister Conferences Best Papers. Pages 8460–8465. https://doi.org/10.24963/ijcai.2024/942

This paper is an extended abstract of our original work published at KDD'23, where it won the Best Research Paper Award. Inspired by the success of prompt learning in NLP, the paper introduces a novel approach to bridging the gap between pre-trained graph models and the diverse tasks they are applied to. Aligning pre-trained models with varied graph tasks (node level, edge level, and graph level) is challenging and can lead to negative transfer and poor performance. To address this, we propose a multi-task prompting method for graphs. Our method unifies the formats of graph prompts and language prompts, enabling NLP's prompting strategies to be adapted for graph tasks. By analyzing the task space of graph applications, we reformulate node-level and edge-level problems as graph-level tasks, and we apply meta-learning to learn a better prompt initialization for multiple tasks. Experiments demonstrate our method's effectiveness in improving model performance across different graph tasks. Beyond the original work, this extended abstract discusses graph prompting from a broader perspective and surveys some of the latest work in this area.
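To make the prompting idea concrete, the sketch below shows one way a learnable graph prompt could look in PyTorch. It is an illustrative assumption rather than the code from the paper: the class name SimpleGraphPrompt and the similarity-weighted insertion rule are hypothetical. A small set of trainable prompt tokens is mixed into the input node features before the graph is fed to a frozen pre-trained GNN.

```python
import torch
import torch.nn as nn


class SimpleGraphPrompt(nn.Module):
    """Hypothetical minimal sketch of a learnable graph prompt.

    A few trainable prompt tokens are attached to the input node
    features via a similarity-weighted inserting pattern; the
    pre-trained GNN downstream stays frozen.
    """

    def __init__(self, num_tokens: int, feat_dim: int):
        super().__init__()
        # Learnable prompt tokens: the only parameters tuned per task.
        self.tokens = nn.Parameter(torch.randn(num_tokens, feat_dim) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, feat_dim) original node features.
        # Weight each token by its similarity to each node, then add
        # the weighted prompt to the node features (one possible
        # inserting pattern; an assumption for illustration).
        weights = torch.softmax(x @ self.tokens.T, dim=-1)  # (N, num_tokens)
        return x + weights @ self.tokens                    # prompted features


if __name__ == "__main__":
    x = torch.randn(5, 16)  # 5 nodes with 16-dim features
    prompt = SimpleGraphPrompt(num_tokens=4, feat_dim=16)
    print(prompt(x).shape)  # torch.Size([5, 16])
```

Under such a setup, task adaptation would update only the prompt parameters, which is what lets a single frozen pre-trained model serve multiple downstream graph tasks.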
Keywords:
Data Mining: DM: Mining graphs
Data Mining: DM: Networks
Machine Learning: ML: Representation learning
Multidisciplinary Topics and Applications: MTA: Web and social networks