Accelerating Large Scale Real-Time GNN Inference using Channel Pruning

Dependencies

Dataset

We use the same data format as GraphSAINT. All datasets used can be downloaded from the Google Drive link and placed in the data folder in the root directory. For the Arxiv dataset, please use the undirected ogbn-arxiv_undirected version instead of the directed ogbn-arxiv version.
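For reference, below is a minimal sketch of loading one dataset in this format; the file layout follows the GraphSAINT repository's documentation, and the data/ppi folder name is an assumption, so substitute whichever dataset you downloaded.

import json
import numpy as np
import scipy.sparse as sp

prefix = "data/ppi"  # assumed folder name; use any dataset you downloaded

adj_full = sp.load_npz(prefix + "/adj_full.npz")    # full-graph adjacency (sparse CSR)
adj_train = sp.load_npz(prefix + "/adj_train.npz")  # adjacency restricted to training nodes
feats = np.load(prefix + "/feats.npy")              # node feature matrix, shape N x F
with open(prefix + "/role.json") as f:
    role = json.load(f)                             # 'tr' / 'va' / 'te' node-index lists
with open(prefix + "/class_map.json") as f:
    class_map = json.load(f)                        # node id -> label (int, or list if multilabel)

print(adj_full.shape, feats.shape, len(role["tr"]))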

Model Parameters

All parameters used in training, pruning, and re-training are stored in .yml files under /train_config/fullbatch and /train_config/minibatch. The fullbatch and minibatch folders contain the parameters for all datasets under full inference and batched inference, respectively. For each dataset and each inference type, we provide three pruning budgets: 2x, 4x, and 8x. Each .yml file includes five required sections and one optional section:

1. network: the GNN architecture
2. params: the parameters of the GNN network
3. phase: parameters to train the original model
4. prune: parameters to prune the trained model
5. retrain_phase: parameters to retrain the pruned model
6. batch_inference (optional): parameters for batched inference

The first three sections use the same format as GraphSAINT (GraphSAINT: Graph Sampling Based Inductive Learning Method, Zeng et al., 2020). Detailed information on the entries in the other sections is as follows.
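Before diving into those entries, a quick way to see what a given config defines is to load it and list its sections. Below is a minimal sketch assuming PyYAML is installed; the example path is a placeholder, not an actual file in the repository.

import yaml  # requires PyYAML

config_path = "train_config/fullbatch/example_2x.yml"  # placeholder; point at a real config file

with open(config_path) as f:
    config = yaml.safe_load(f)

# The five required sections plus the optional batched-inference section.
for section in ("network", "params", "phase", "prune", "retrain_phase", "batch_inference"):
    print(f"{section}: {'present' if section in config else 'absent'}")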

Run

We have two Cython modules that need to be compiled before running. To compile them, run the following from the root directory:

python GNN/setup.py build_ext --inplace
python GNN/pytorch_version/setup.py build_ext --inplace

To run the code:

python -m GNN.pytorch_version.train --data_prefix <path-to-dataset-folder> --train_config <path-to-config-file>
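For example, a hypothetical invocation for the PPI dataset with a 2x fullbatch budget (both paths are assumptions; point them at the dataset folder and config file you actually use):

python -m GNN.pytorch_version.train --data_prefix data/ppi --train_config train_config/fullbatch/ppi_2x.yml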

We have also set up some useful flags.

Citation

Here is the BibTeX entry in case you want to cite our work.

@article{10.14778/3461535.3461547,
  author = {Zhou, Hongkuan and Srivastava, Ajitesh and Zeng, Hanqing and Kannan, Rajgopal and Prasanna, Viktor},
  title = {Accelerating Large Scale Real-Time GNN Inference Using Channel Pruning},
  year = {2021},
  issue_date = {May 2021},
  publisher = {VLDB Endowment},
  volume = {14},
  number = {9},
  issn = {2150-8097},
  url = {https://doi.org/10.14778/3461535.3461547},
  doi = {10.14778/3461535.3461547},
  journal = {Proc. VLDB Endow.},
  month = {may},
  pages = {1597–1605},
  numpages = {9}
}