Awesome Graph Condensation Papers
<img src="https://img.shields.io/badge/Contributions-Welcome-278ea5" alt="Contrib"/>
Graph condensation (GC) is a data-centric approach that accelerates GNN model training by creating a compact yet representative graph to replace the original graph. It enables GNNs trained on the condensed graph to match the performance of those trained on the original graph.
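At a high level, most GC methods can be viewed as a bilevel optimization problem. The sketch below follows the formulation commonly used in gradient-matching-based methods; the notation is illustrative rather than tied to any single paper. The condensed graph $\mathcal{S}=(A', X', Y')$ is optimized so that a GNN trained on $\mathcal{S}$ minimizes the task loss on the original graph $\mathcal{T}=(A, X, Y)$:

```latex
% Bilevel objective of graph condensation (illustrative notation):
% the outer problem optimizes the condensed graph S = (A', X', Y'),
% the inner problem trains a GNN on S.
\min_{\mathcal{S}} \; \mathcal{L}\big(\mathrm{GNN}_{\theta_{\mathcal{S}}}(A, X),\, Y\big)
\quad \text{s.t.} \quad
\theta_{\mathcal{S}} = \arg\min_{\theta} \; \mathcal{L}\big(\mathrm{GNN}_{\theta}(A', X'),\, Y'\big)
```

The papers collected below differ mainly in how they approximate this objective, e.g., by matching gradients, receptive-field distributions, or eigenbases, or by kernel-based approaches such as neural tangent kernels.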
<p align="center"> <img src="main.jpg" alt="GC" width="750"> </p>

This repository aims to provide a comprehensive resource for researchers and practitioners interested in exploring various aspects of graph condensation.
For a detailed overview of graph condensation techniques and their applications, we recommend reading our survey paper: 🔥Graph Condensation: A Survey. This survey paper serves as an excellent starting point for understanding the fundamentals of graph condensation and exploring its diverse applications.
Latest Updates
[27/11/2024] Contrastive Graph Condensation: Advancing Data Versatility through Self-Supervised Learning (Xinyi Gao et al. Arxiv'24)
[05/09/2024] GSTAM: Efficient Graph Distillation with Structural Attention-Matching (Arash Rasti-Meymandi et al. ECCV'24)
[28/08/2024] Self-Supervised Learning for Graph Dataset Condensation (Yuxiang Wang et al. KDD'24)
[31/07/2024] Backdoor Graph Condensation (Jiahao Wu et al. Arxiv'24)
[20/07/2024] TinyGraph: Joint Feature and Node Condensation for Graph Neural Networks (Yezi Liu et al. Arxiv'24)
Contribution
We welcome contributions to enhance the breadth and depth of this repository. If you have a paper related to graph condensation that you believe should be included, please feel free to submit a pull request. Together, we can build a valuable resource for the graph condensation community.
| conference/journal'year | [paper_name](paper_link) | Authors | [[code]](code_link) |
Contents
The repository is organized into categories to facilitate easy navigation and exploration of papers related to graph condensation, covering effectiveness, efficiency, generalization, fairness, robustness, and applications.
- Graph Condensation Papers
  - Survey
  - Methodology
    - Effective Graph Condensation
    - Efficient Graph Condensation
    - Generalized Graph Condensation
    - Fair Graph Condensation
    - Robust Graph Condensation
  - Applications
  - Open-Source Libraries
  - Related Repositories
  - Contact
Survey
Venue | Paper | Authors |
---|---|---|
Arxiv'24 | Graph Condensation: A Survey | Xinyi Gao et al. |
IJCAI'24 | A Comprehensive Survey on Graph Reduction: Sparsification, Coarsening, and Condensation | Mohammad Hashemi & Wei Jin et al. |
Arxiv'24 | A Survey on Graph Condensation | Hongjia Xu et al. |
Methodology
Effective Graph Condensation
Efficient Graph Condensation
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
KDD'22 | DosCond | Condensing Graphs via One-Step Gradient Matching | Wei Jin et al. | [code] |
Arxiv'22 | GCDM | Graph Condensation via Receptive Field Distribution Matching | Mengyang Liu et al. | |
KDD'23 | KIDD | Kernel Ridge Regression-Based Graph Dataset Distillation | Zhe Xu et al. | [code] |
WWW'24 | GC-SNTK | Fast Graph Condensation with Structure-based Neural Tangent Kernel | Lin Wang et al. | |
ICLR'24 | Mirage | Mirage: Model-Agnostic Graph Distillation for Graph Classification | Mridul Gupta et al. | [code] |
Arxiv'24 | DisCo | Disentangled Condensation for Large-scale Graphs | Zhenbang Xiao et al. | [code] |
WWW'24 | EXGC | EXGC: Bridging Efficiency and Explainability in Graph Condensation | Junfeng Fang et al. | [code] |
Arxiv'24 | SimGC | Simple Graph Condensation | Zhenbang Xiao et al. | [code] |
Arxiv'24 | CGC | Rethinking and Accelerating Graph Condensation: A Training-Free Approach with Class Partition | Xinyi Gao et al. | |
Generalized Graph Condensation
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
NeurIPS'23 | SGDD | Does Graph Distillation See Like Vision Dataset Counterpart? | Beining Yang et al. | [code] |
ICML'24 | GDEM | Graph Distillation with Eigenbasis Matching | Yang Liu et al. | |
KDD'24 | OpenGC | Graph Condensation for Open-World Graph Learning | Xinyi Gao et al. | |
Arxiv'24 | CTGC | Contrastive Graph Condensation: Advancing Data Versatility through Self-Supervised Learning | Xinyi Gao et al. | |
Fair Graph Condensation
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
NeurIPS'23 | FGD | Fair Graph Distillation | Qizhang Feng et al. | |
AS'23 | GCARe | GCARe: Mitigating Subgroup Unfairness in Graph Condensation through Adversarial Regularization | Runze Mao et al. | |
Robust Graph Condensation
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
Arxiv'24 | RobGC | RobGC: Towards Robust Graph Condensation | Xinyi Gao et al. | |
Applications
Graph Continual Learning
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
ICDM'23 | CaT | CaT: Balanced Continual Graph Learning with Graph Condensation | Yilun Liu et al. | [code] |
Arxiv'23 | PUMA | PUMA: Efficient Continual Graph Learning with Graph Condensation | Yilun Liu et al. | [code] |
Hyper-Parameter/Neural Architecture Search
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
Arxiv'23 | HCDC | Faster Hyperparameter Search for GNNs via Calibrated Dataset Condensation | Mucong Ding et al. | |
Federated Learning
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
Arxiv'23 | FedGKD | FedGKD: Unleashing the Power of Collaboration in Federated Graph Neural Networks | Qiying Pan et al. | |
Arxiv'24 | FedGC | Federated Graph Condensation with Information Bottleneck Principles | Bo Yan | |
Inference Acceleration
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
ICDE'24 | MCond | Graph Condensation for Inductive Node Representation Learning | Xinyi Gao et al. | |
Heterogeneous Graph
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
TKDE'24 | HGCond | Heterogeneous Graph Condensation | Jian Gao et al. | [code] |
Backdoor Attack
Venue | Method | Paper | Authors | Code |
---|---|---|---|---|
Arxiv'24 | BGC | Backdoor Graph Condensation | Jiahao Wu et al. | |
Open-Source Libraries
Library | Paper | Implementation | #GC Methods | #Datasets | Tasks |
---|---|---|---|---|---|
GCondenser | [paper] | PyG, DGL | 6 | 7 | Node classification |
GC-Bench | [paper] | PyG | 9 | 12 | Node classification, graph classification, link prediction, node clustering, anomaly detection |
GraphSlim | [paper] | PyG | 7 | 5 | Node classification |
Related Repositories
In addition to this Graph Condensation Papers Repository, you may find the following related repositories valuable for your research and exploration:
Contact
For any inquiries or suggestions regarding this repository, please don't hesitate to contact us by opening an issue on this repository.
Thank you for your interest in the Graph Condensation Papers Repository. We hope it proves valuable for your research and exploration. If you find this repository useful, please cite our survey paper:
@article{gao2024graph,
title={Graph condensation: A survey},
author={Gao, Xinyi and Yu, Junliang and Chen, Tong and Ye, Guanhua and Zhang, Wentao and Yin, Hongzhi},
journal={arXiv preprint arXiv:2401.11720},
year={2024}
}