Awesome-Graph-Prompt
A collection of AWESOME things about prompting on graphs.
Recently, the "pre-train, fine-tune" workflow has been shown to be less effective and efficient when dealing with diverse downstream tasks in the graph domain. Inspired by prompt learning in natural language processing (NLP), the "pre-train, prompt" workflow has emerged as a promising alternative.
This repo aims to provide a curated list of research papers that explore prompting on graphs. It is based on our survey paper: Graph Prompt Learning: A Comprehensive Survey and Beyond. We will try to keep this list updated frequently. If you find any errors or missing papers, please don't hesitate to open an issue or a pull request. 🌹
Table of Contents
- Awesome-Graph-Prompt
  - GNN Prompting Papers
  - Multi-Modal Prompting with Graphs
  - Graph Domain Adaptation with Prompting
  - Application Papers
  - Other Resources
  - Contributing
  - Citation
GNN Prompting Papers
Summary of existing representative works on graph prompting. A minimal, illustrative sketch of the common "frozen encoder + learnable prompt" recipe follows this list.
- GPPT: Graph Pre-training and Prompt Tuning to Generalize Graph Neural Networks. In KDD'2022, [Paper] [Code].
- SGL-PT: A Strong Graph Learner with Graph Prompt Tuning. In arXiv, [Paper].
- GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks. In WWW'2023, [Paper] [Code].
- All in One: Multi-Task Prompting for Graph Neural Networks. In KDD'2023 Best Paper Award 🌟, [Paper] [Code].
- Virtual Node Tuning for Few-shot Node Classification. In KDD'2023, [Paper].
- PRODIGY: Enabling In-context Learning Over Graphs. In NeurIPS'2023 Spotlight 🌟, [Paper] [Code].
- Universal Prompt Tuning for Graph Neural Networks. In NeurIPS'2023, [Paper] [Code].
- Deep Prompt Tuning for Graph Transformers. In arXiv, [Paper].
- Prompt Tuning for Multi-View Graph Contrastive Learning. In arXiv, [Paper].
- ULTRA-DP: Unifying Graph Pre-training with Multi-task Graph Dual Prompt. In arXiv, [Paper].
- HetGPT: Harnessing the Power of Prompt Tuning in Pre-Trained Heterogeneous Graph Neural Networks. In arXiv, [Paper].
- Enhancing Graph Neural Networks with Structure-Based Prompt. In arXiv, [Paper].
- Generalized Graph Prompt: Toward a Unification of Pre-Training and Downstream Tasks on Graphs. In arXiv, [Paper] [Code].
- HGPROMPT: Bridging Homogeneous and Heterogeneous Graphs for Few-shot Prompt Learning. In AAAI'2024, [Paper].
- MultiGPrompt for Multi-Task Pre-Training and Prompting on Graphs. In arXiv, [Paper].
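The mechanics differ from paper to paper (feature prompts, structure tokens, subgraph prompts, task tokens, etc.), but many of the works above share a common recipe: keep a pre-trained GNN encoder frozen and train only a small learnable prompt plus a light task head on the downstream data. The sketch below is a minimal, illustrative example of feature-space prompting in this spirit (closest to GPF-style universal prompt tuning); it is not the exact method of any single paper above, all class names and sizes are our own placeholders, and it assumes PyTorch Geometric is installed.

```python
# Minimal, illustrative sketch of the "freeze the encoder, learn a prompt" recipe.
# NOT the exact method of any paper above; names and dimensions are placeholders.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class PretrainedGNN(torch.nn.Module):
    """Stand-in for a GNN encoder that has already been pre-trained."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, hid_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


class PromptedGNN(torch.nn.Module):
    """Adds a learnable prompt vector to every node feature and keeps the
    pre-trained encoder frozen; only the prompt and a linear head are trained."""
    def __init__(self, encoder, in_dim, hid_dim, num_classes):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False                      # freeze pre-trained weights
        self.prompt = torch.nn.Parameter(torch.zeros(in_dim))
        self.head = torch.nn.Linear(hid_dim, num_classes)

    def forward(self, x, edge_index):
        h = self.encoder(x + self.prompt, edge_index)    # prompt applied in feature space
        return self.head(h)


# Usage sketch (e.g. node classification on Cora): train only prompt + head.
# encoder = PretrainedGNN(in_dim=1433, hid_dim=64)       # weights loaded from pre-training
# model = PromptedGNN(encoder, in_dim=1433, hid_dim=64, num_classes=7)
# optimizer = torch.optim.Adam(
#     [p for p in model.parameters() if p.requires_grad], lr=0.01)
```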
Multi-Modal Prompting with Graphs
Prompt in Text-Attributed Graphs
- Augmenting Low-Resource Text Classification with Graph-Grounded Pre-training and Prompting. In SIGIR'2023, [Paper] [Code].
- Prompt Tuning on Graph-augmented Low-resource Text Classification. In arXiv, [Paper] [Code].
- Prompt-Based Zero- and Few-Shot Node Classification: A Multimodal Approach. In arXiv, [Paper].
- Prompt-based Node Feature Extractor for Few-shot Learning on Text-Attributed Graphs. In arXiv, [Paper].
- Large Language Models as Topological Structure Enhancers for Text-Attributed Graphs. In arXiv, [Paper].
Large Language Models in Graph Data Processing
For this research line, please refer to Awesome LLMs with Graph Tasks [Survey Paper | GitHub Repo].
We highly recommend this work, as it provides a comprehensive survey of research on integrating LLMs with graphs 👍
Multi-modal Fusion with Graph and Prompting
- GraphAdapter: Tuning Vision-Language Models With Dual Knowledge Graph. In NeurIPS'2023, [Paper] [Code]. (Graph+Text+Image)
- SynerGPT: In-Context Learning for Personalized Drug Synergy Prediction and Drug Design. In arXiv, [Paper]. (Graph+Text)
- Which Modality should I use - Text, Motif, or Image? Understanding Graphs with Large Language Models. In arXiv, [Paper]. (Graph+Text+Image)
Graph Domain Adaptation with Prompting
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks. In KDD'2023, [Paper] [Code].
- GraphControl: Adding Conditional Control to Universal Graph Pre-trained Models for Graph Domain Transfer Learning. In arXiv, [Paper].
Application Papers
Social Networks
- Prompt-and-Align: Prompt-Based Social Alignment for Few-Shot Fake News Detection. In CIKM'2023, [Paper] [Code]. (Fake News Detection)
- Voucher Abuse Detection with Prompt-based Fine-tuning on Graph Neural Networks. In CIKM'2023, [Paper]. (Fraud Detection)
Recommender Systems
- Contrastive Graph Prompt-tuning for Cross-domain Recommendation. In TOIS'2023, [Paper]. (Cross-domain Recommendation)
- An Empirical Study Towards Prompt-Tuning for Graph Contrastive Pre-Training in Recommendations. In NeurIPS'2023, [Paper] [Code]. (General Recommendation)
- Motif-Based Prompt Learning for Universal Cross-Domain Recommendation. In WSDM'2024, [Paper]. (Cross-domain Recommendation)
- Graph Pre-training and Prompt Learning for Recommendation. In arXiv, [Paper]. (General Recommendation)
Knowledge Graph
- Structure Pretraining and Prompt Tuning for Knowledge Graph Transfer. In WWW'2023, [Paper] [Code].
- Graph Neural Prompting with Large Language Models. In arXiv, [Paper].
- Knowledge Graph Prompting for Multi-Document Question Answering. In arXiv, [Paper] [Code].
Biology
- Can Large Language Models Empower Molecular Property Prediction? In arXiv, [Paper] [Code].
- GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning. In NeurIPS'2023, [Paper] [Code].
- MolCA: Molecular Graph-Language Modeling with Cross-Modal Projector and Uni-Modal Adapter. In EMNLP'2023, [Paper] [Code].
- ReLM: Leveraging Language Models for Enhanced Chemical Reaction Prediction. In EMNLP'2023, [Paper] [Code].
Others
- A Data-centric Framework to Endow Graph Neural Networks with Out-Of-Distribution Detection Ability. In KDD'2023, [Paper] [Code]. (OOD Detection)
Other Resources
Open Source
- ProG: A Unified Library for Graph Prompting [Website] [Code]
  ProG (Prompt Graph) is a library built upon PyTorch that makes it easy to conduct single- or multi-task prompting for pre-trained Graph Neural Networks (GNNs).
Datasets
Datasets that are commonly used in GNN prompting papers.
Citation Networks
Dataset | #Node | #Edge | #Feature | #Class |
---|---|---|---|---|
Cora | 2708 | 5429 | 1433 | 7 |
CoraFull | 19793 | 63421 | 8710 | 70 |
Citeseer | 3327 | 4732 | 3703 | 6 |
DBLP | 17716 | 105734 | 1639 | 4 |
Pubmed | 19717 | 44338 | 500 | 3 |
Coauthor-CS | 18333 | 81894 | 6805 | 15 |
Coauthor-Physics | 34493 | 247962 | 8415 | 5 |
ogbn-arxiv | 169343 | 1166243 | 128 | 40 |
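If you want to experiment with these citation benchmarks, most of them ship with PyTorch Geometric. The snippet below is a quick, hedged loading example (not taken from this repo): root paths are placeholders, and the splits and preprocessing vary across the papers above.

```python
# Example (not from the repo): loading some of the citation benchmarks above
# with PyTorch Geometric. Root paths are placeholders.
from torch_geometric.datasets import Planetoid, Coauthor

cora = Planetoid(root='data/Planetoid', name='Cora')[0]        # 2708 nodes, 7 classes
citeseer = Planetoid(root='data/Planetoid', name='CiteSeer')[0]
coauthor_cs = Coauthor(root='data/Coauthor', name='CS')[0]     # Coauthor-CS
print(cora.num_nodes, cora.num_edges, cora.x.size(1))
```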
Purchase Networks
Dataset | #Node | #Edge | #Feature | #Class |
---|---|---|---|---|
Amazon-Computers | 13752 | 245861 | 767 | 10 |
Amazon-Photo | 7650 | 119081 | 745 | 8 |
ogbn-products | 2449029 | 61859140 | 100 | 47 |
Social Networks
Dataset | #Node | #Edge | #Feature | #Class |
---|---|---|---|---|
Reddit | 232965 | 11606919 | 602 | 41 |
Flickr | 89250 | 899756 | 500 | 7 |
Molecular Graphs
Dataset | #Graph | #Node (Avg.) | #Edge (Avg.) | #Feature | #Class |
---|---|---|---|---|---|
COX2 | 467 | 41.22 | 43.45 | 3 | 2 |
ENZYMES | 600 | 32.63 | 62.14 | 18 | 6 |
MUTAG | 188 | 17.93 | 19.79 | 7 | 2 |
MUV | 93087 | 24.23 | 26.28 | - | 17 |
HIV | 41127 | 25.53 | 27.48 | - | 2 |
SIDER | 1427 | 33.64 | 35.36 | - | 27 |
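Similarly, the graph-classification benchmarks above are available through PyTorch Geometric: COX2, ENZYMES, and MUTAG via TUDataset, and MUV, HIV, and SIDER via MoleculeNet (which additionally requires RDKit). A small, hedged example, again with placeholder paths:

```python
# Example (not from the repo): loading the molecular benchmarks above.
from torch_geometric.datasets import TUDataset, MoleculeNet

mutag = TUDataset(root='data/TUDataset', name='MUTAG')         # 188 graphs, 2 classes
enzymes = TUDataset(root='data/TUDataset', name='ENZYMES')
hiv = MoleculeNet(root='data/MoleculeNet', name='HIV')         # needs RDKit for SMILES parsing
print(len(mutag), mutag.num_classes)
```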
Online Talks
- Official Presentation of All in One [Link]
Blogs
- A Chinese Blog on Graph Prompting (including GPPT, GraphPrompt, All in One, etc.) [Link]
Contributing
👍 Contributions to this repository are welcome!
If you have come across relevant resources, feel free to open an issue or submit a pull request.
Citation
If you find our work helpful, please feel free to cite it:
@article{sun2023graph,
title = {Graph Prompt Learning: A Comprehensive Survey and Beyond},
author = {Sun, Xiangguo and Zhang, Jiawen and Wu, Xixi and Cheng, Hong and Xiong, Yun and Li, Jia},
year = {2023},
journal = {arXiv:2311.16534},
eprint = {2311.16534},
archiveprefix = {arxiv}
}
@inproceedings{sun2023all,
title={All in One: Multi-Task Prompting for Graph Neural Networks},
author={Sun, Xiangguo and Cheng, Hong and Li, Jia and Liu, Bo and Guan, Jihong},
booktitle={Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD'23)},
year={2023},
pages = {2120–2131},
location = {Long Beach, CA, USA},
isbn = {9798400701030},
url = {https://doi.org/10.1145/3580305.3599256},
doi = {10.1145/3580305.3599256}
}