
<p align="center"> <img width = "700" height = "300" src="https://github.com/EdisonLeeeee/GraphGallery/blob/master/imgs/graphgallery.svg" alt="banner"/> <br/> </p> <p align="center"><strong><em>PyTorch</em> is all you need!</strong></p> <p align=center> <a href="https://www.python.org/downloads/release/python-360/"> <img src="https://img.shields.io/badge/Python->=3.6-3776AB?logo=python" alt="Python"> </a> <!-- <a href="https://github.com/tensorflow/tensorflow/releases/tag/v2.1.0"> <img src="https://img.shields.io/badge/TensorFlow->=2.1.0-FF6F00?logo=tensorflow" alt="tensorflow"> </a> --> <a href="https://github.com/pytorch/pytorch"> <img src="https://img.shields.io/badge/PyTorch->=1.4-FF6F00?logo=pytorch" alt="pytorch"> </a> <a href="https://pypi.org/project/graphgallery/"> <img src="https://badge.fury.io/py/graphgallery.svg" alt="pypi"> </a> <a href="https://github.com/EdisonLeeeee/GraphGallery/blob/master/LICENSE"> <img src="https://img.shields.io/github/license/EdisonLeeeee/GraphGallery" alt="license"> </a> </p>

GraphGallery

GraphGallery is a gallery for benchmarking Graph Neural Networks (GNNs) built on a pure PyTorch backend. Alternatively, the PyTorch Geometric (PyG) and Deep Graph Library (DGL) backends are also available in GraphGallery to facilitate your implementations.

💨 NEWS

🚀 Installation

Please make sure you have installed PyTorch. PyTorch Geometric (PyG) and Deep Graph Library (DGL) are optional alternative backends.

Install from source:

# Recommended
git clone https://github.com/EdisonLeeeee/GraphGallery.git && cd GraphGallery
pip install -e . --verbose

where -e means "editable" mode, so you don't have to reinstall every time you make changes.

NOTE: GraphGallery is a frequently updated package, so please DO NOT install it with pip for now. We're currently working on releasing a binary distribution on PyPI, stay tuned!
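To check that the editable install worked, a quick import is enough. This is a minimal sketch; it assumes graphgallery exposes a __version__ attribute (graphgallery.backend() is the helper shown in "Other Backends" below).

# Sanity check after installation.
# Assumption: graphgallery exposes __version__ (common, but not stated in these docs).
import graphgallery

print(graphgallery.__version__)   # installed GraphGallery version
print(graphgallery.backend())     # current backend, e.g. "PyTorch 1.9.0+cu111 Backend"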

🤖 Implementations

In detail, the following methods are currently implemented:

Node Classification

| Method | Author | Paper | PyTorch | PyG | DGL |
| --- | --- | --- | :---: | :---: | :---: |
| ChebyNet | Michaël Defferrard et al. | Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering (NeurIPS'16) | :heavy_check_mark: | | |
| GCN | Thomas N. Kipf et al. | Semi-Supervised Classification with Graph Convolutional Networks (ICLR'17) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| GraphSAGE | William L. Hamilton et al. | Inductive Representation Learning on Large Graphs (NeurIPS'17) | :heavy_check_mark: | :heavy_check_mark: | |
| FastGCN | Jie Chen et al. | FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling (ICLR'18) | :heavy_check_mark: | | |
| GAT | Petar Veličković et al. | Graph Attention Networks (ICLR'18) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| SGC | Felix Wu et al. | Simplifying Graph Convolutional Networks (ICLR'19) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| GWNN | Bingbing Xu et al. | Graph Wavelet Neural Network (ICLR'19) | :heavy_check_mark: | | |
| ClusterGCN | Wei-Lin Chiang et al. | Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks (KDD'19) | :heavy_check_mark: | | |
| DAGNN | Meng Liu et al. | Towards Deeper Graph Neural Networks (KDD'20) | :heavy_check_mark: | :heavy_check_mark: | |
| GDC | Johannes Klicpera et al. | Diffusion Improves Graph Learning (NeurIPS'19) | :heavy_check_mark: | | |
| TAGCN | Jian Du et al. | Topology Adaptive Graph Convolutional Networks (arXiv'17) | :heavy_check_mark: | | |
| APPNP, PPNP | Johannes Klicpera et al. | Predict then Propagate: Graph Neural Networks meet Personalized PageRank (ICLR'19) | :heavy_check_mark: | :heavy_check_mark: | |
| PDN | Benedek Rozemberczki et al. | Pathfinder Discovery Networks for Neural Message Passing (ICLR'21) | :heavy_check_mark: | | |
| SSGC | Zhu et al. | Simple Spectral Graph Convolution (ICLR'21) | :heavy_check_mark: | | |
| AGNN | Kiran K. Thekumparampil et al. | Attention-based Graph Neural Network for semi-supervised learning (ICLR'18 openreview) | :heavy_check_mark: | | |
| ARMA | Bianchi et al. | Graph Neural Networks with convolutional ARMA filters (arXiv'19) | | | |
| GraphMLP | Yang Hu et al. | Graph-MLP: Node Classification without Message Passing in Graph (arXiv'21) | :heavy_check_mark: | | |
| LGC, EGC, hLGC | Luca Pasa et al. | Simple Graph Convolutional Networks (arXiv'21) | :heavy_check_mark: | | |
| GRAND | Wenzheng Feng et al. | Graph Random Neural Network for Semi-Supervised Learning on Graphs (NeurIPS'20) | :heavy_check_mark: | | |
| AlaGCN, AlaGAT | Yiqing Xie et al. | When Do GNNs Work: Understanding and Improving Neighborhood Aggregation (IJCAI'20) | :heavy_check_mark: | | |
| JKNet | Keyulu Xu et al. | Representation Learning on Graphs with Jumping Knowledge Networks (ICML'18) | :heavy_check_mark: | | |
| MixHop | Sami Abu-El-Haija et al. | MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing (ICML'19) | :heavy_check_mark: | | |
| DropEdge | Yu Rong et al. | DropEdge: Towards Deep Graph Convolutional Networks on Node Classification (ICLR'20) | :heavy_check_mark: | | |
| Node2Grids | Dalong Yang et al. | Node2Grids: A Cost-Efficient Uncoupled Training Framework for Large-Scale Graph Learning (CIKM'21) | :heavy_check_mark: | | |
| RobustGCN | Dingyuan Zhu et al. | Robust Graph Convolutional Networks Against Adversarial Attacks (KDD'19) | :heavy_check_mark: | :heavy_check_mark: | |
| SBVAT, OBVAT | Zhijie Deng et al. | Batch Virtual Adversarial Training for Graph Convolutional Networks (ICML'19) | :heavy_check_mark: | | |
| SimPGCN | Wei Jin et al. | Node Similarity Preserving Graph Convolutional Networks (WSDM'21) | :heavy_check_mark: | | |
| GraphVAT | Fuli Feng et al. | Graph Adversarial Training: Dynamically Regularizing Based on Graph Structure (TKDE'19) | :heavy_check_mark: | | |
| LATGCN | Hongwei Jin et al. | Latent Adversarial Training of Graph Convolution Networks (ICML@LRGSD'19) | :heavy_check_mark: | | |
| DGAT | Weibo Hu et al. | Robust graph convolutional networks with directional graph adversarial training (Applied Intelligence'19) | :heavy_check_mark: | | |
| MedianGCN, TrimmedGCN | Liang Chen et al. | Understanding Structural Vulnerability in Graph Convolutional Networks (IJCAI'21) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |

Graph Purification

The graph purification methods are universal for all models; just specify

graph_transform="purification_method"

when setting up the graph. Here we only give examples of GCN with purification methods (see the sketch after the table below); other models work in the same way.

| Method | Author | Paper |
| --- | --- | --- |
| GCN-Jaccard | Huijun Wu et al. | Adversarial Examples on Graph Data: Deep Insights into Attack and Defense (IJCAI'19) |
| GCN-SVD | Negin Entezari et al. | All You Need Is Low (Rank): Defending Against Adversarial Attacks on Graphs (WSDM'20) |
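A minimal sketch of what this looks like in code, following the GCN Quick Start below. The transform name "jaccard_purification" and passing it to setup_graph via graph_transform are assumptions for illustration; check trainer.help() for the exact registered name in your version.

# Sketch: GCN with a graph purification transform (GCN-Jaccard style).
# Assumptions: the Jaccard purification transform is registered under a name
# like "jaccard_purification" and is accepted by setup_graph via graph_transform;
# the exact string may differ, see trainer.help().
import torch
import graphgallery
from graphgallery.datasets import Planetoid

graphgallery.set_backend("torch")
from graphgallery.gallery.nodeclas import GCN

data = Planetoid('cora', root="~/GraphData/datasets/", verbose=True)
splits = data.split_nodes()
device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')

trainer = GCN(device=device, seed=123).setup_graph(
    data.graph, graph_transform="jaccard_purification").build()
trainer.fit(splits.train_nodes, splits.val_nodes, verbose=1)
print(trainer.evaluate(splits.test_nodes))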

Link Prediction

| Method | Author | Paper | PyTorch | PyG | DGL |
| --- | --- | --- | :---: | :---: | :---: |
| GAE, VGAE | Thomas N. Kipf et al. | Variational Graph Auto-Encoders (NeurIPS'16) | :heavy_check_mark: | :heavy_check_mark: | |

Node Embedding

The following methods are framework-agnostic.

| Method | Author | Paper |
| --- | --- | --- |
| Deepwalk | Bryan Perozzi et al. | DeepWalk: Online Learning of Social Representations (KDD'14) |
| Node2vec | Aditya Grover and Jure Leskovec | node2vec: Scalable Feature Learning for Networks (KDD'16) |
| Node2vec+ | Renming Liu et al. | Accurately Modeling Biased Random Walks on Weighted Graphs Using Node2vec+ |
| BANE | Hong Yang et al. | Binarized Attributed Network Embedding (ICDM'18) |

⚡ Quick Start

Datasets

You can simply run dataset.available_datasets() to see the available datasets, e.g.:

from graphgallery.datasets import Planetoid
print(Planetoid.available_datasets())

For more details, please refer to GraphData.

Example of GCN (Node Classification Task)

It takes just a few lines of code.

import torch
import graphgallery
from graphgallery.datasets import Planetoid
from graphgallery.gallery import callbacks

# Load the Cora citation graph and split its nodes into train/val/test sets
data = Planetoid('cora', root="~/GraphData/datasets/", verbose=True)
graph = data.graph
splits = data.split_nodes()
device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')

# Use the pure PyTorch backend and its GCN trainer
graphgallery.set_backend("torch")
from graphgallery.gallery.nodeclas import GCN

# Build the trainer, checkpoint the best model by validation accuracy, train, and evaluate
trainer = GCN(device=device, seed=123).setup_graph(graph, feat_transform="normalize_feat").build()
cb = callbacks.ModelCheckpoint('model.pth', monitor='val_accuracy')
trainer.fit(splits.train_nodes, splits.val_nodes, verbose=1, callbacks=[cb])
results = trainer.evaluate(splits.test_nodes)
print(f'Test loss {results.loss:.5}, Test accuracy {results.accuracy:.2%}')

Example of GAE (Link Prediction Task)

import torch
import graphgallery
from graphgallery.gallery import callbacks
from graphgallery.datasets import Planetoid

data = Planetoid('cora', root="~/GraphData/datasets/", verbose=True)
graph = data.graph
splits = data.split_edges(random_state=15)
device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')

graphgallery.set_backend("torch")

from graphgallery.gallery.linkpred import GAE
trainer = GAE(device=device, seed=123).setup_graph(graph).build()
cb = callbacks.ModelCheckpoint('model.pth', monitor='val_ap')
trainer.fit(splits.train_pos_edge_index,
            val_data=(splits.val_pos_edge_index, splits.val_neg_edge_index), 
            verbose=1, callbacks=[cb])
results = trainer.evaluate((splits.test_pos_edge_index, splits.test_neg_edge_index))
print(results)

If you run into any trouble, you can simply run trainer.help() for more information.

Other Backends

>>> import graphgallery
# Default: PyTorch backend
>>> graphgallery.backend()
PyTorch 1.9.0+cu111 Backend
# Switch to PyTorch Geometric backend
>>> graphgallery.set_backend("pyg")
# Switch to DGL PyTorch backend
>>> graphgallery.set_backend("dgl")
# Switch to PyTorch backend
>>> graphgallery.set_backend("th") # "torch", "pytorch"

But your code doesn't need to change at all.
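For example, the GCN Quick Start above runs as-is on the PyG backend; only the set_backend call differs. A sketch, assuming torch_geometric is installed so that the "pyg" backend is available:

# The same Quick Start workflow, dispatched to the PyG implementation of GCN.
# Assumption: torch_geometric is installed, so the "pyg" backend is available.
import torch
import graphgallery
from graphgallery.datasets import Planetoid

graphgallery.set_backend("pyg")                 # the only line that changes
from graphgallery.gallery.nodeclas import GCN   # now resolves to the PyG model

data = Planetoid('cora', root="~/GraphData/datasets/", verbose=True)
splits = data.split_nodes()
device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')

trainer = GCN(device=device, seed=123).setup_graph(data.graph, feat_transform="normalize_feat").build()
trainer.fit(splits.train_nodes, splits.val_nodes, verbose=1)
print(trainer.evaluate(splits.test_nodes))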

❓ How to add your own datasets

This is motivated by gnn-benchmark.

from graphgallery.data import Graph

# Load the adjacency matrix A, attribute (feature) matrix X and labels vector y
# A - scipy.sparse.csr_matrix of shape [num_nodes, num_nodes]
# X - scipy.sparse.csr_matrix or numpy.ndarray of shape [num_nodes, num_feats]
# y - numpy.ndarray of shape [num_nodes]

mydataset = Graph(adj_matrix=A, attr_matrix=X, label=y)
# save dataset
mydataset.to_npz('path/to/mydataset.npz')
# load dataset
mydataset = Graph.from_npz('path/to/mydataset.npz')
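As a concrete illustration, here is a tiny synthetic graph built with numpy/scipy and wrapped into a Graph. The shapes follow the comments above, and 'path/to/mydataset.npz' is just a placeholder path.

# Sketch: build a toy 4-node cycle graph and wrap it as a GraphGallery Graph.
import numpy as np
import scipy.sparse as sp
from graphgallery.data import Graph

num_nodes, num_feats = 4, 8
# Symmetric adjacency matrix A of the cycle 0-1-2-3-0
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 0]])
rows = np.concatenate([edges[:, 0], edges[:, 1]])
cols = np.concatenate([edges[:, 1], edges[:, 0]])
A = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(num_nodes, num_nodes))
# Dense attribute matrix X and label vector y
X = np.random.rand(num_nodes, num_feats).astype(np.float32)
y = np.array([0, 0, 1, 1])

mydataset = Graph(adj_matrix=A, attr_matrix=X, label=y)
mydataset.to_npz('path/to/mydataset.npz')             # save (placeholder path)
mydataset = Graph.from_npz('path/to/mydataset.npz')   # load it back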

⭐ Road Map

❓ FAQ

Please feel free to contact me if you have any questions or run into any trouble.

😘 Acknowledgement

This project is motivated by PyTorch Geometric, StellarGraph, DGL, etc., as well as the original implementations from the authors of the methods above. Thanks for their excellent work!

Cite

Please cite our paper (and the respective papers of the methods used) if you use this code in your own work:

@inproceedings{li2021graphgallery,
  author    = {Jintang Li and Kun Xu and Liang Chen and Zibin Zheng and Xiao Liu},
  booktitle = {2021 IEEE/ACM 43rd International Conference on Software Engineering: Companion Proceedings (ICSE-Companion)},
  title     = {GraphGallery: A Platform for Fast Benchmarking and Easy Development of Graph Neural Networks Based Intelligent Software},
  year      = {2021},
  pages     = {13--16},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}