CPF

The official code for the WWW 2021 paper: Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework

Getting Started

Requirements

Usage

Quick start

  1. Run `python train_dgl.py --dataset=XXX --teacher=XXX` to train the teacher model.
  2. Run `python spawn_worker.py --dataset=XXX --teacher=XXX` to train the student model. We provide the hyper-parameter settings reported in our paper, as well as an AutoML version for hyper-parameter search: our code supports Optuna, and passing `--automl` runs it to search for the best knowledge-distillation hyper-parameters.
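At its core, the student step trains on the teacher's soft label distribution. As a generic, hypothetical illustration of that soft-label matching (the function names are ours, and the repo's actual objective differs, e.g. it also uses the framework described in the paper), a minimal sketch:

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=1.0):
    """KL divergence from the teacher's soft labels to the student's,
    averaged over nodes. T is the usual distillation temperature."""
    p_t = softmax(teacher_logits / T)
    p_s = softmax(student_logits / T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=1)
    return float(kl.mean())
```

The loss is zero exactly when the student reproduces the teacher's distribution, and grows as the two distributions diverge.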

Add your own datasets

You can add your own datasets to the data folder; their format should follow DGL's requirements.
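The minimal ingredients a DGL graph dataset needs are an edge list plus per-node features and labels. A hypothetical toy example (the array names are ours, not the repo's exact file format; check the existing files under data for that):

```python
import numpy as np

# A 4-node toy graph: directed edges 0->1, 1->2, 2->3, 3->0.
src = np.array([0, 1, 2, 3])
dst = np.array([1, 2, 3, 0])

# Per-node features (4 nodes, 8-dimensional) and class labels.
features = np.random.rand(4, 8).astype(np.float32)
labels = np.array([0, 1, 0, 1])

# With DGL installed, these arrays become a graph, roughly:
#   import dgl, torch
#   g = dgl.graph((src, dst), num_nodes=4)
#   g.ndata["feat"] = torch.from_numpy(features)
#   g.ndata["label"] = torch.from_numpy(labels)
```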

Add your own models

You can add your own teacher or student models by adding them to the models folder and following the format used to run the existing models.
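The exact interface to follow is defined by the existing models in that folder; as a hypothetical sketch, the essential contract for a student is simply mapping node features to class logits:

```python
import numpy as np

class TinyMLPStudent:
    """Hypothetical student: a two-layer MLP mapping node features to
    class logits. The class name and forward() signature are illustrative;
    match the base class/interface of the models in the models folder."""

    def __init__(self, in_dim, hidden_dim, n_classes, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (in_dim, hidden_dim))
        self.W2 = rng.normal(0.0, 0.1, (hidden_dim, n_classes))

    def forward(self, x):
        h = np.maximum(x @ self.W1, 0.0)  # ReLU hidden layer
        return h @ self.W2                # class logits
```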

Results

Here are some results with a GCN teacher model on different datasets and student variants. More results can be found in our paper.

| Datasets | GCN (Teacher) | CPF-ind (Student) | CPF-tra (Student) | Improvement |
| --- | --- | --- | --- | --- |
| Cora | 0.8244 | 0.8576 | 0.8567 | 4.0% |
| Citeseer | 0.7110 | 0.7619 | 0.7652 | 7.6% |
| Pubmed | 0.7804 | 0.8080 | 0.8104 | 3.8% |
| A-Computers | 0.8318 | 0.8443 | 0.8443 | 1.5% |
| A-Photo | 0.9072 | 0.9317 | 0.9248 | 2.7% |
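The improvement column appears to be the relative gain of the better student variant over the teacher, e.g. for Cora (0.8576 - 0.8244) / 0.8244 ≈ 4.0%. A quick check:

```python
# Accuracies from the table above: GCN teacher vs. the better CPF student.
teacher = {"Cora": 0.8244, "Citeseer": 0.7110, "Pubmed": 0.7804,
           "A-Computers": 0.8318, "A-Photo": 0.9072}
best_student = {"Cora": 0.8576, "Citeseer": 0.7652, "Pubmed": 0.8104,
                "A-Computers": 0.8443, "A-Photo": 0.9317}

# Relative improvement in percent, rounded to one decimal place.
improvement = {d: round((best_student[d] - teacher[d]) / teacher[d] * 100, 1)
               for d in teacher}
# → {'Cora': 4.0, 'Citeseer': 7.6, 'Pubmed': 3.8,
#    'A-Computers': 1.5, 'A-Photo': 2.7}
```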

Benchmark Rankings

Here are results from several models run on different benchmark datasets. Our experiment settings are given in the table below and in the pwc.conf.yaml file. For simple usage, try the AutoML hyper-parameter search.

Note:

| Benchmark | Model | Acc | layer | emb_dim | feat_drop | attn_drop | lr | wd |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| PWC | CPF-tra-GCNII | 84.1% | 6 | 16 | 0.2 | 0.5 | 5e-3 | 1e-2 |
| PWC | CPF-tra-APPNP | 80.26% | 8 | 32 | 0.2 | 0.2 | 5e-3 | 5e-4 |
| PWC | CPF-tra-GCNII | 84.18% | 9 | 8 | 0.5 | 0.8 | 5e-3 | 1e-2 |
| PWC | CPF-ind-APPNP | 80.24% | 8 | 16 | 0.8 | 0.2 | 5e-3 | 1e-2 |
| PWC | CPF-ind-APPNP | 77.3% | 7 | 32 | 0.8 | 0.2 | 1e-3 | 1e-3 |
| PWC | CPF-ind-GAT | 85.5% | 8 | 16 | 0.2 | 0.5 | 1e-3 | 1e-2 |
| PWC | CPF-ind-GAT | 94.1% | 9 | 32 | 0.5 | 0.5 | 1e-2 | 1e-2 |
| PWC | CPF-ind-APPNP | 85.3% | 10 | 64 | 0.8 | 0.8 | 5e-3 | 5e-4 |
| PWC | CPF-ind-APPNP | 74.6% | 6 | 64 | 0.5 | 0.5 | 5e-3 | 1e-2 |
| PWC | CPF-tra-GCNII | 83.2% | 8 | 16 | 0.8 | 0.8 | 1e-2 | 5e-4 |

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{yang2021extract,
  title={Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework},
  author={Cheng Yang and Jiawei Liu and Chuan Shi},
  booktitle={Proceedings of The Web Conference 2021 (WWW ’21)},
  publisher={ACM},
  year={2021}
}

Contact Us

Please open an issue or contact Liu_Jiawei@bupt.edu.cn with any questions.