We provide the code (in PyTorch) and datasets for our paper "GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks", which has been accepted by WWW 2023.

We further extend GraphPrompt to GraphPrompt+ by enhancing both the pre-training and prompting stages, as described in "Generalized Graph Prompt: Toward a Unification of Pre-Training and Downstream Tasks on Graphs", which has been accepted by IEEE TKDE. The code and datasets for GraphPrompt+ are publicly available at https://github.com/gmcmt/graph_prompt_extension.

Description

The repository is organised as follows:

Package Dependencies
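The exact dependency list is not preserved in this copy of the README. As a minimal sketch, the code is implemented in PyTorch (stated above); any additional packages, such as a graph learning library, are assumptions rather than confirmed requirements.

```bash
# Minimal environment sketch. Only PyTorch is confirmed by this README;
# a graph library such as DGL is an assumption and may not match the actual requirements.
pip install torch dgl
```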

Running Experiments

Graph Classification

The default dataset is ENZYMES. To train and evaluate on other datasets, change the corresponding parameters in pre_train.py and prompt_fewshot.py.

Pretrain:
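The pre-training command was not preserved in this copy of the README. Since the dataset and other settings are configured inside pre_train.py (see above), a plausible invocation is simply running the script, as sketched below; any command-line flags are assumptions not confirmed here.

```bash
# Sketch only: runs pre-training with the parameters set inside pre_train.py
# (e.g., the default ENZYMES dataset).
python pre_train.py
```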

Prompt tune and test:
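Likewise, the prompt-tuning command is missing here; a plausible sketch, assuming prompt_fewshot.py is configured in the same way:

```bash
# Sketch only: prompt tuning and few-shot evaluation for graph classification,
# using the parameters set inside prompt_fewshot.py.
python prompt_fewshot.py
```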

Node Classification

The default dataset is ENZYMES. To train and evaluate on other datasets, change the corresponding parameters in prompt_fewshot.py.

Prompt tune and test:
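As above, the exact command is not preserved; a sketch assuming the node classification task is selected via the parameters inside prompt_fewshot.py:

```bash
# Sketch only: prompt tuning and few-shot evaluation for node classification;
# task and dataset selection are assumed to be set inside prompt_fewshot.py.
python prompt_fewshot.py
```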

Citation

@inproceedings{liu2023graphprompt,
  title={GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks},
  author={Liu, Zemin and Yu, Xingtong and Fang, Yuan and Zhang, Xinming},
  booktitle={Proceedings of the ACM Web Conference 2023},
  year={2023}
}