GPPT: Graph Pre-training and Prompt Tuning to Generalize Graph Neural Networks
A paper freshly accepted at KDD 2022, and the first work in the GNN field to use prompts for self-supervised learning.
Motivation
Existing GNN pre-training methods rarely account for the inherent gap in training objectives between the pretext and downstream tasks. This significant gap often requires costly fine-tuning to adapt the pre-trained model to the downstream problem, which prevents the efficient elicitation of pre-trained knowledge and ultimately hurts downstream performance.
Contribution
To bridge this task gap, the authors propose a novel transfer-learning paradigm to generalize GNNs, namely Graph Pre-training and Prompt Tuning (GPPT). The GNN is pre-trained with link prediction, and the downstream node classification task is then reformulated as that same link-prediction problem: each class is represented by a learnable task token, and a node is assigned to the class whose token it is most likely to link to, so the pre-trained model can be reused without full fine-tuning.
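Below is a minimal PyTorch sketch of this idea, assuming node embeddings come from a frozen GNN pre-trained with link prediction. The class name GPPTPromptHead and the variable names are hypothetical, and the sketch omits parts of the full method (e.g., structure tokens aggregated from node neighborhoods and the orthogonality regularization on task tokens); it only illustrates how classification can be cast as scoring links against per-class task tokens.

import torch
import torch.nn as nn

class GPPTPromptHead(nn.Module):
    """Hypothetical prompt head: node classification as link prediction
    between node embeddings and learnable per-class task tokens."""

    def __init__(self, hidden_dim: int, num_classes: int):
        super().__init__()
        # One learnable task token per class, matching the node embedding size.
        self.task_tokens = nn.Parameter(torch.randn(num_classes, hidden_dim))

    def forward(self, node_emb: torch.Tensor) -> torch.Tensor:
        # Link-prediction scores between each node and each task token;
        # the class whose token is most strongly "linked" to the node wins.
        return node_emb @ self.task_tokens.t()  # (num_nodes, num_classes)

# Usage with stand-in embeddings from a frozen pre-trained GNN.
hidden_dim, num_classes, num_nodes = 64, 7, 10
node_emb = torch.randn(num_nodes, hidden_dim)      # placeholder GNN output
head = GPPTPromptHead(hidden_dim, num_classes)
logits = head(node_emb)
labels = torch.randint(0, num_classes, (num_nodes,))
loss = nn.CrossEntropyLoss()(logits, labels)       # only the prompt head is tuned

Since only the task tokens (and not the GNN) are updated during prompt tuning, the downstream adaptation stays lightweight, which is the efficiency argument the paper makes against full fine-tuning.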