A Systematic Survey of Chemical Pre-trained Models [IJCAI 2023]


This repository collects resources for readers interested in pre-training on molecules. If you notice resources on this topic that are missing, feel free to let us know via GitHub issues, pull requests, or email (xiajun@westlake.edu.cn). We will update this repository and the paper regularly to keep them up to date.

Last updated: 2023-06-17

Contents

- [Papers List](#papers)
  - [Pretraining Strategies](#prestrategies)
  - [Knowledge-Enriched Pretraining Strategies](#know)
  - [Hard Negative Mining Strategies](#hard)
  - [Tuning Strategies](#tunestrategies)
  - [Applications](#Applications)
  - [Others](#others)
- [Open-Sourced Pretrained Graph Models](#PGMs)
- [Pre-training Datasets](#Datasets)
- [Citation](#cite)

<a name="papers"></a>

Papers List

<a name="prestrategies"></a>

Pretraining Strategies

  1. [ArXiv 2023] Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective [paper]
  2. [ECML-PKDD 2023] CoSP: Co-supervised pretraining of pocket and ligand [paper]
  3. [ChemRxiv] Learning chemical intuition from humans in the loop [paper]
  4. [Nature Machine Intelligence] Knowledge graph-enhanced molecular contrastive learning with functional prompt [paper]
  5. [IJCAI 2023] A Systematic Survey of Chemical Pre-trained Models [paper] [code]
  6. [ICML 2023] A Group Symmetric Stochastic Differential Equation Model for Molecule Multi-modal Pretraining [paper] [code]
  7. [ArXiv 2023] SELFormer: Molecular Representation Learning via SELFIES Language Models [paper]
  8. [Nature Machine Intelligence] Regression Transformer enables concurrent sequence regression and generation for molecular language modelling [paper]
  9. [Digital Discovery] Chemical representation learning for toxicity prediction [paper]
  10. [ArXiv 2023] Denoise Pre-training on Non-equilibrium Molecules for Accurate and Transferable Neural Potentials [paper]
  11. [JPCM] Extracting Predictive Representations from Hundreds of Millions of Molecules [paper]
  12. [ArXiv 2023] Molecular Property Prediction by Semantic-invariant Contrastive Learning [paper]
  13. [ArXiv 2023] Enhancing Activity Prediction Models in Drug Discovery with the Ability to Understand Human Language [paper]
  14. [ArXiv 2023] Multi-modal Molecule Structure-text Model for Text-based Retrieval and Editing [paper]
  15. [ChemRxiv] Is GPT-3 all you need for low-data discovery in chemistry? [paper] [code]
  16. [ICLR 2023] Mole-BERT: Rethinking Pre-training Graph Neural Networks for Molecules [paper] [code]
  17. [ICLR 2023] Molecular Geometry Pretraining with SE(3)-Invariant Denoising Distance Matching [paper] [code]
  18. [ICLR 2023] Pre-training via Denoising for Molecular Property Prediction [paper] [code]
  19. [ICLR 2023] Retrieval-based Controllable Molecule Generation [paper]
  20. [Research 2022] Pushing the Boundaries of Molecular Property Prediction for Drug Discovery with Multitask Learning BERT Enhanced by SMILES Enumeration [paper]
  21. [Briefings in Bioinformatics] MG-BERT: leveraging unsupervised atomic representation learning for molecular property prediction [paper] [code]
  22. [ArXiv 2023] Drug Synergistic Combinations Predictions via Large-Scale Pre-Training and Graph Structure Learning [paper]
  23. [JMGM 2023] MolRoPE-BERT: An enhanced molecular representation with Rotary Position Embedding for molecular property prediction [paper] [code]
  24. [Nature Machine Intelligence 2022] Accurate prediction of molecular properties and drug targets using a self-supervised image representation learning framework [paper] [code]
  25. [AAAI 2023] Energy-motivated Equivariant Pretraining for 3D Molecular Graphs [paper] [code]
  26. [ArXiv 2023] Molecular Language Model as Multi-task Generator [paper] [code]
  27. [Openreview 2022] MolBART: Generative Masked Language Models for Molecular Representations [paper] [code]
  28. [KDD 2022] KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction [paper] [code]
  29. [EMNLP 2022] Translation between Molecules and Natural Language [paper] [code]
  30. [JCIM] MolGPT: Molecular Generation Using a Transformer-Decoder Model [paper]
  31. [Bioinformatics] MICER: a pre-trained encoder–decoder architecture for molecular image captioning [paper]
  32. [ECCV 2022] Generative Subgraph Contrast for Self-Supervised Graph Representation Learning [paper]
  33. [ArXiv] Analyzing Data-Centric Properties for Contrastive Learning on Graphs [paper]
  34. [ArXiv] Generative Subgraph Contrast for Self-Supervised Graph Representation Learning [paper]
  35. [Bioinformatics] Multidrug Representation Learning Based on Pretraining Model and Molecular Graph for Drug Interaction and Combination Prediction [paper]
  36. [BioRxiv] PanGu Drug Model: Learn a Molecule Like a Human [paper]
  37. [ChemRxiv] Uni-Mol: A Universal 3D Molecular Representation Learning Framework [paper]
  38. [ICML 2022] ProGCL: Rethinking Hard Negative Mining in Graph Contrastive Learning [paper] [code]
  39. [ICML 2022] Let Invariant Rationale Discovery Inspire Graph Contrastive Learning [paper]
  40. [AI4Science@ICML 2022] Pre-training Graph Neural Networks for Molecular Representations: Retrospect and Prospect [paper]
  41. [TNNLS 2022] CLEAR: Cluster-Enhanced Contrast for Self-Supervised Graph Representation Learning [paper]
  42. [Information Sciences] A new self-supervised task on graphs: Geodesic distance prediction [paper]
  43. [ArXiv 2022] Hard Negative Sampling Strategies for Contrastive Representation Learning [paper]
  44. [ArXiv 2022] Rethinking and Scaling Up Graph Contrastive Learning: An Extremely Efficient Approach with Group Discrimination [paper]
  45. [ArXiv 2022] KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction [paper]
  46. [ArXiv 2022] COSTA: Covariance-Preserving Feature Augmentation for Graph Contrastive Learning [paper]
  47. [ArXiv 2022] Evaluating Self-Supervised Learning for Molecular Graph Embeddings [paper]
  48. [ArXiv 2022] I'm Me, We're Us, and I'm Us: Tri-directional Contrastive Learning on Hypergraphs [paper]
  49. [ArXiv 2022] Triangular Contrastive Learning on Molecular Graphs [paper]
  50. [ArXiv 2022] ImGCL: Revisiting Graph Contrastive Learning on Imbalanced Node Classification [paper]
  51. [KDD 2022] GraphMAE: Self-Supervised Masked Graph Autoencoders [paper]
  52. [ArXiv 2022] MaskGAE: Masked Graph Modeling Meets Graph Autoencoders [paper]
  53. [TNSE 2022] Deep Multi-Attributed-View Graph Representation Learning [paper]
  54. [TSIPN 2022] Fair Contrastive Learning on Graphs [paper]
  55. [EasyChair] Cross-Perspective Graph Contrastive Learning [paper]
  56. [WWW 2022 Workshop] A Content-First Benchmark for Self-Supervised Graph Representation Learning [paper]
  57. [Arxiv 2022] SCGC: Self-Supervised Contrastive Graph Clustering [paper]
  58. [ICASSP 2022] Graph Fine-Grained Contrastive Representation Learning [paper]
  59. [TCYB 2022] Multiview Deep Graph Infomax to Achieve Unsupervised Graph Embedding [paper]
  60. [Arxiv 2022] Augmentation-Free Graph Contrastive Learning [paper]
  61. [Arxiv 2022] A Simple Yet Effective Pretraining Strategy for Graph Few-shot Learning [paper]
  62. [Arxiv 2022] Unsupervised Heterophilous Network Embedding via r-Ego Network Discrimination [paper]
  63. [CVPR 2022] Node Representation Learning in Graph via Node-to-Neighbourhood Mutual Information Maximization [paper]
  64. [Arxiv 2022] GraphCoCo: Graph Complementary Contrastive Learning [paper]
  65. [AAAI 2022] Simple Unsupervised Graph Representation Learning [paper]
  66. [SDM 2022] Neural Graph Matching for Pre-training Graph Neural Networks[paper]
  67. [Nature Machine Intelligence 2022] Molecular contrastive learning of representations via graph neural networks [paper]
  68. [WWW 2022] SimGRACE: A Simple Framework for Graph Contrastive Learning without Data Augmentation [paper] [code]
  69. [WWW 2022] Rumor Detection on Social Media with Graph Adversarial Contrastive Learning [paper]
  70. [WWW 2022] Robust Self-Supervised Structural Graph Neural Network for Social Network Prediction [paper]
  71. [WWW 2022] Dual Space Graph Contrastive Learning [paper]
  72. [WWW 2022] Adversarial Graph Contrastive Learning with Information Regularization [paper]
  73. [WWW 2022] The Role of Augmentations in Graph Contrastive Learning: Current Methodological Flaws & Improved Practices [paper]
  74. [WWW 2022] ClusterSCL: Cluster-Aware Supervised Contrastive Learning on Graphs [paper]
  75. [WWW 2022] Graph Communal Contrastive Learning [paper]
  76. [TKDE 2022] CCGL: Contrastive Cascade Graph Learning [paper][code]
  77. [BIBM 2021] Molecular Graph Contrastive Learning with Parameterized Explainable Augmentations [paper]
  78. [WSDM 2022]Bringing Your Own View: Graph Contrastive Learning without Prefabricated Data Augmentations [paper] [code]
  79. [SDM 2022] Structure-Enhanced Heterogeneous Graph Contrastive Learning [paper]
  80. [AAAI 2022] GeomGCL: Geometric Graph Contrastive Learning for Molecular Property Prediction [paper]
  81. [AAAI 2022] Self-supervised Graph Neural Networks via Diverse and Interactive Message Passing [paper]
  82. [AAAI 2022] Augmentation-Free Self-Supervised Learning on Graphs [paper][code]
  83. [AAAI 2022] Deep Graph Clustering via Dual Correlation Reduction [paper][code]
  84. [ICOIN 2022] Adaptive Self-Supervised Graph Representation Learning [paper]
  85. [arXiv 2022] Graph Masked Autoencoder [paper]
  86. [arXiv 2022] Structural and Semantic Contrastive Learning for Self-supervised Node Representation Learning [paper]
  87. [arXiv 2022] Graph Self-supervised Learning with Accurate Discrepancy Learning [paper]
  88. [arXiv 2021] Multilayer Graph Contrastive Clustering Network [paper]
  89. [arXiv 2021] Graph Representation Learning via Contrasting Cluster Assignments [paper]
  90. [arXiv 2021] Graph-wise Common Latent Factor Extraction for Unsupervised Graph Representation Learning [paper]
  91. [arXiv 2021] Bayesian Graph Contrastive Learning [paper]
  92. [NeurIPS 2021 Workshop] Self-Supervised GNN that Jointly Learns to Augment [paper]
  93. [NeurIPS 2021] Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [paper]
  94. [NeurIPS 2021] Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization [paper]
  95. [NeurIPS 2021] Motif-based Graph Self-Supervised Learning for Molecular Property Prediction [paper]
  96. [NeurIPS 2021] Graph Adversarial Self-Supervised Learning [paper]
  97. [NeurIPS 2021] Contrastive laplacian eigenmaps [paper]
  98. [NeurIPS 2021] Directed Graph Contrastive Learning [paper][code]
  99. [NeurIPS 2021] Multi-view Contrastive Graph Clustering [paper][code]
  100. [NeurIPS 2021] From Canonical Correlation Analysis to Self-supervised Graph Neural Networks [paper][code]
  101. [NeurIPS 2021] InfoGCL: Information-Aware Graph Contrastive Learning [paper]
  102. [NeurIPS 2021] Adversarial Graph Augmentation to Improve Graph Contrastive Learning [paper][code]
  103. [NeurIPS 2021] Disentangled Contrastive Learning on Graphs [paper]
  104. [arXiv 2021] Subgraph Contrastive Link Representation Learning [paper]
  105. [arXiv 2021] Augmentations in Graph Contrastive Learning: Current Methodological Flaws & Towards Better Practices [paper]
  106. [arXiv 2021] Collaborative Graph Contrastive Learning: Data Augmentation Composition May Not be Necessary for Graph Representation Learning [paper]
  107. [CIKM 2021] Contrastive Pre-Training of GNNs on Heterogeneous Graphs [paper]
  108. [CIKM 2021] Self-supervised Representation Learning on Dynamic Graphs [paper]
  109. [CIKM 2021] SGCL: Contrastive Representation Learning for Signed Graphs [paper]
  110. [CIKM 2021] Semi-Supervised and Self-Supervised Classification with Multi-View Graph Neural Networks [paper]
  111. [arXiv 2021] Graph Communal Contrastive Learning [paper]
  112. [arXiv 2021] Self-supervised Contrastive Attributed Graph Clustering [paper]
  113. [arXiv 2021] Adaptive Multi-layer Contrastive Graph Neural Networks [paper]
  114. [arXiv 2021] Graph-MVP: Multi-View Prototypical Contrastive Learning for Multiplex Graphs [paper]
  115. [arXiv 2021] Spatio-Temporal Graph Contrastive Learning [paper]
  116. [IJCAI 2021] Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning [paper]
  117. [IJCAI 2021] Pairwise Half-graph Discrimination: A Simple Graph-level Self-supervised Strategy for Pre-training Graph Neural Networks [paper]
  118. [arXiv 2021] RRLFSOR: An Efficient Self-Supervised Learning Strategy of Graph Convolutional Networks [paper]
  119. [ICML 2021] Graph Contrastive Learning Automated [paper] [code]
  120. [ICML 2021] Self-supervised Graph-level Representation Learning with Local and Global Structure [paper] [code]
  121. [arXiv 2021] Group Contrastive Self-Supervised Learning on Graphs [paper]
  121. [arXiv 2021] Multi-Level Graph Contrastive Learning [paper]
  122. [KDD 2021] Pre-training on Large-Scale Heterogeneous Graph [paper]
  123. [KDD 2021] Self-supervised Heterogeneous Graph Neural Network with Co-contrastive Learning [paper] [code]
  124. [arXiv 2021] Prototypical Graph Contrastive Learning [paper]
  125. [arXiv 2021] Graph Barlow Twins: A self-supervised representation learning framework for graphs [paper]
  126. [arXiv 2021] Self-Supervised Graph Learning with Proximity-based Views and Channel Contrast [paper]
  127. [arXiv 2021] FedGL: Federated Graph Learning Framework with Global Self-Supervision [paper]
  128. [IJCNN 2021] Node Embedding using Mutual Information and Self-Supervision based Bi-level Aggregation [paper]
  129. [arXiv 2021] Graph Representation Learning by Ensemble Aggregating Subgraphs via Mutual Information Maximization [paper]
  130. [arXiv 2021] Self-supervised Auxiliary Learning for Graph Neural Networks via Meta-Learning [paper]
  131. [arXiv 2021] Towards Robust Graph Contrastive Learning [paper]
  132. [arXiv 2021] Pre-Training on Dynamic Graph Neural Networks [paper]
  133. [WWW 2021] Graph Contrastive Learning with Adaptive Augmentation [paper] [code]
  134. [Arxiv 2020] Distance-wise Graph Contrastive Learning [paper]
  135. [Openreview 2020] Motif-Driven Contrastive Learning of Graph Representations [paper]
  136. [Openreview 2020] SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks [paper]
  137. [Openreview 2020] TopoTER: Unsupervised Learning of Topology Transformation Equivariant Representations [paper]
  138. [Openreview 2020] Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [paper]
  139. [NeurIPS 2020] Self-Supervised Graph Transformer on Large-Scale Molecular Data [paper]
  140. [NeurIPS 2020] Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs [paper] [code]
  141. [NeurIPS 2020] Graph Contrastive Learning with Augmentations [paper] [code]
  142. [Arxiv 2020] Deep Graph Contrastive Representation Learning [paper]
  143. [ICML 2020] When Does Self-Supervision Help Graph Convolutional Networks? [paper] [code]
  144. [ICML 2020] Contrastive Multi-View Representation Learning on Graphs. [paper] [code]
  145. [ICML 2020 Workshop] Self-supervised edge features for improved Graph Neural Network training. [paper]
  146. [Arxiv 2020] Self-supervised Training of Graph Convolutional Networks. [paper]
  147. [Arxiv 2020] Self-Supervised Graph Representation Learning via Global Context Prediction. [paper]
  148. [KDD 2020] GPT-GNN: Generative Pre-Training of Graph Neural Networks. [pdf] [code]
  149. [KDD 2020] GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training. [pdf] [code]
  150. [Arxiv 2020] Graph-Bert: Only Attention is Needed for Learning Graph Representations. [paper] [code]
  151. [ICLR 2020] InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization. [paper] [code]
  152. [ICLR 2020] Strategies for Pre-training Graph Neural Networks. [paper] [code]
  153. [KDD 2019 Workshop] SGR: Self-Supervised Spectral Graph Representation Learning. [paper]
  154. [ICLR 2019 workshop] Pre-Training Graph Neural Networks for Generic Structural Feature Extraction. [paper]
  155. [Arxiv 2019] Heterogeneous Deep Graph Infomax [paper] [code]
  156. [ICLR 2019] Deep Graph Infomax. [paper] [code]

<a name="know"></a>

Knowledge-Enriched Pretraining Strategies

  1. [Arxiv 2022] KnowAugNet: Multi-Source Medical Knowledge Augmented Medication Prediction Network with Multi-Level Graph Contrastive Learning [paper]
  2. [Nature Machine Intelligence 2022] Geometry-enhanced molecular representation learning for property prediction [paper]
  3. [ICLR 2022] Pre-training Molecular Graph Representation with 3D Geometry [paper] [code]
  4. [AAAI 2022] Molecular Contrastive Learning with Chemical Element Knowledge Graph [paper]
  5. [KDD 2021] MoCL: Data-driven Molecular Fingerprint via Knowledge-aware Contrastive Learning from Molecular Graph [paper] [code]
  6. [arXiv 2021] 3D Infomax improves GNNs for Molecular Property Prediction [paper] [code]
  7. [ICLR 2022] Chemical-Reaction-Aware Molecule Representation Learning [paper][code]

<a name="hard"></a>

Hard Negative Mining Strategies

  1. [ICML 2022] ProGCL: Rethinking Hard Negative Mining in Graph Contrastive Learning [paper] [code]
  2. [SDM 2022] Structure-Enhanced Heterogeneous Graph Contrastive Learning [paper]
  3. [Signal Processing 2021] Negative Sampling Strategies for Contrastive Self-Supervised Learning of Graph Representations [paper]
  4. [IJCAI 2021] Graph Debiased Contrastive Learning with Joint Representation Clustering [paper]
  5. [IJCAI 2021] CuCo: Graph Representation with Curriculum Contrastive Learning [paper]
  6. [arXiv 2021] Debiased Graph Contrastive Learning [paper]

<a name="tunestrategies"></a>

Tuning Strategies

  1. [KDD 2021] Adaptive Transfer Learning on Graph Neural Networks [paper]
  2. [BioRxiv 2022] Towards Effective and Generalizable Fine-tuning for Pre-trained Molecular Graph Models [paper]
  3. [AAAI 2022] CODE: Contrastive Pre-training with Adversarial Fine-tuning for Zero-shot Expert Linking [paper] [code]
  4. [Arxiv 2022] Fine-Tuning Graph Neural Networks via Graph Topology induced Optimal Transport [paper]

<a name="Applications"></a>

Applications

  1. [The Journal of Chemical Physics] Transfer Learning using Attentions across Atomic Systems with Graph Neural Networks (TAAG) [paper]
  2. [SIGIR 2022] Are Graph Augmentations Necessary? Simple Graph Contrastive Learning for Recommendation [paper]
  3. [Arxiv 2022] Protein Representation Learning by Geometric Structure Pretraining [paper]
  4. [Nature Communications 2021] Masked graph modeling for molecule generation [paper]
  5. [NPL 2022] How Does Bayesian Noisy Self-Supervision Defend Graph Convolutional Networks? [paper]
  6. [arXiv 2022] Self-supervised Graphs for Audio Representation Learning with Limited Labeled Data [paper]
  7. [arXiv 2022] Link Prediction with Contextualized Self-Supervision [paper]
  8. [arXiv 2022] Learning Robust Representation through Graph Adversarial Contrastive Learning [paper]
  9. [WWW 2021] Multi-view Graph Contrastive Representation Learning for Drug-drug Interaction Prediction [paper]
  10. [BIBM 2021] SGAT: a Self-supervised Graph Attention Network for Biomedical Relation Extraction [paper]
  11. [ICBD 2021] Session-based Recommendation via Contrastive Learning on Heterogeneous Graph [paper]
  12. [arXiv 2021] Graph Augmentation-Free Contrastive Learning for Recommendation [paper]
  13. [arXiv 2021] TCGL: Temporal Contrastive Graph for Self-supervised Video Representation Learning [paper]
  14. [NeurIPS 2021 Workshop] Contrastive Embedding of Structured Space for Bayesian Optimisation [paper]
  15. [ICCSNT 2021] Graph Data Augmentation based on Adaptive Graph Convolution for Skeleton-based Action Recognition [paper]
  16. [arXiv 2021] Pre-training Graph Neural Network for Cross Domain Recommendation [paper]
  17. [CIKM 2021] Social Recommendation with Self-Supervised Metagraph Informax Network [paper] [code]
  18. [arXiv 2021] Self-Supervised Learning for Molecular Property Prediction [paper]
  19. [arXiv 2021] Contrastive Graph Convolutional Networks for Hardware Trojan Detection in Third Party IP Cores [paper]
  20. [KBS 2021] Multi-aspect self-supervised learning for heterogeneous information network [paper]
  21. [arXiv 2021] Hyper Meta-Path Contrastive Learning for Multi-Behavior Recommendation [paper]
  22. [arXiv 2021] Generative and Contrastive Self-Supervised Learning for Graph Anomaly Detection [paper]
  23. [IJCAI 2021] CSGNN: Contrastive Self-Supervised Graph Neural Network for Molecular Interaction Prediction [paper]
  24. [arXiv 2021] GCCAD: Graph Contrastive Coding for Anomaly Detection [paper]
  25. [arXiv 2021] Contrastive Self-supervised Sequential Recommendation with Robust Augmentation [paper]
  26. [KDD 2021] Contrastive Multi-View Multiplex Network Embedding with Applications to Robust Network Alignment [paper]
  27. [arXiv 2021] Hop-Count Based Self-Supervised Anomaly Detection on Attributed Networks [paper]
  28. [arXiv 2021] Representation Learning for Networks in Biology and Medicine: Advancements, Challenges, and Opportunities [paper]
  29. [arXiv 2021] Drug Target Prediction Using Graph Representation Learning via Substructures Contrast [paper]
  30. [Arxiv 2021] Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation [paper] [code]
  31. [ICLR 2021] How to Find Your Friendly Neighborhood: Graph Attention Design with Self-Supervision [paper] [code]
  32. [WSDM 2021] Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation [paper] [code]
  33. [ICML 2020] Graph-based, Self-Supervised Program Repair from Diagnostic Feedback. [paper]

<a name="others"></a>

Others

  1. [arXiv 2022] A Survey of Pretraining on Graphs: Taxonomy, Methods, and Applications [paper]
  2. [NeurIPS 2021 datasets and benchmark track] An Empirical Study of Graph Contrastive Learning [paper]
  3. [arXiv 2021] Evaluating Modules in Graph Contrastive Learning [paper] [code]
  4. [arXiv 2021] Graph Self-Supervised Learning: A Survey [paper]
  5. [arXiv 2021] Self-Supervised Learning of Graph Neural Networks: A Unified Review [paper]
  6. [Arxiv 2020] Self-supervised Learning on Graphs: Deep Insights and New Direction. [paper] [code]
  7. [ICLR 2019 Workshop] Can Graph Neural Networks Go "Online"? An Analysis of Pretraining and Inference. [paper]

<a name="PGMs"></a>

Open-Sourced Pretrained Graph Models

| PGMs | Architecture | Pretraining Database | # Params. | Download Link |
| --- | --- | --- | --- | --- |
| Hu et al. | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| Graph-BERT | Graph Transformer | Cora + CiteSeer + PubMed | N/A | Link |
| GraphCL | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| GPT-GNN | HGT | OAG + Amazon | N/A | Link |
| GCC | 5-layer GIN | Academia + DBLP + IMDB + Facebook + LiveJournal | < 1M | Link |
| JOAO | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| AD-GCL | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | N/A |
| GraphLog | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| GROVER | GTransformer | ZINC + ChEMBL (10M) | 48M ~ 100M | Link |
| MGSSL | 5-layer GIN | ZINC15 (250K) | ~ 2M | Link |
| CPT-HG | HGT | DBLP + YELP + Aminer | N/A | N/A |
| MPG | MolGNet | ZINC + ChEMBL (11M) | 53M | N/A |
| LP-Info | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| SimGRACE | 5-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~ 2M | Link |
| MolCLR | GCN + GIN | PubChem (10M) | N/A | Link |
| DMP | DeeperGCN + Transformer | PubChem (110M) | 104.1M | N/A |
| ChemRL-GEM | GeoGNN | ZINC15 (20M) | N/A | Link |
| KCL | GCN + KMPNN | ZINC15 (250K) | < 1M | N/A |
| 3D Infomax | PNA | QM9 (50K) + GEOM-Drugs (140K) + QMugs (620K) | N/A | Link |
| GraphMVP | GIN + SchNet | GEOM (50K) | ~ 2M | Link |
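Most of the GIN-based checkpoints linked above are ordinary PyTorch state dicts wrapped around a 5-layer GIN encoder. The snippet below is a rough, hypothetical sketch of how such a checkpoint might be restored with PyTorch Geometric; the file name, feature dimensions, and the use of `strict=False` are assumptions that need to be adapted to whichever release you actually download.

```python
# Hypothetical sketch: restoring a downloaded 5-layer GIN checkpoint with
# PyTorch Geometric. The path and dimensions are placeholders, not taken
# from any specific release in the table above.
import torch
from torch_geometric.nn.models import GIN

# Assumption: a 5-layer GIN encoder with 300-dimensional hidden states.
encoder = GIN(in_channels=300, hidden_channels=300, num_layers=5)

state_dict = torch.load("pretrained_gin.pth", map_location="cpu")  # placeholder path
# strict=False because released checkpoints often use different parameter names;
# inspect the returned key lists to verify the weights were actually matched.
missing, unexpected = encoder.load_state_dict(state_dict, strict=False)
encoder.eval()
```

Transformer-based entries such as GROVER or MPG usually ship their own model definitions, so their official loading scripts should be preferred over a generic loader like this one.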

<a name="Datasets"></a>

Pre-training Datasets

| Name | Category | Download Link |
| --- | --- | --- |
| ZINC | Molecular Graph | Link |
| ChEMBL | Molecular Graph | Link |
| PubChem | Molecular Graph | Link |
| QM9 | Molecular Graph | Link |
| QMugs | Molecular Graph | Link |
| GEOM | Molecular Graph | Link |
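For quick experiments, some of these corpora are also packaged by common graph-learning libraries. The sketch below assumes PyTorch Geometric is installed and that its bundled copies of ZINC and QM9, which may differ in size and preprocessing from the raw downloads linked above, are sufficient for your purposes.

```python
# Minimal sketch (assumes PyTorch Geometric; datasets are downloaded on first use).
from torch_geometric.datasets import ZINC, QM9
from torch_geometric.loader import DataLoader

zinc = ZINC(root="data/ZINC", subset=True)  # subset=True: 12K-molecule subset; False: ~250K
qm9 = QM9(root="data/QM9")                  # ~130K small molecules with 3D coordinates

loader = DataLoader(zinc, batch_size=128, shuffle=True)
batch = next(iter(loader))                  # a batch of molecular graphs ready for a GNN
print(batch.num_graphs, batch.x.shape)
```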
<a name="cite"></a>

Citation

@inproceedings{xia2022pretraining,
  title={Pre-training Graph Neural Networks for Molecular Representations: Retrospect and Prospect},
  author={Jun Xia and Yanqiao Zhu and Yuanqi Du and Stan Z. Li},
  booktitle={ICML 2022 2nd AI for Science Workshop},
  year={2022},
  url={https://openreview.net/forum?id=dhXLkrY2Nj3}
}

@article{xia2023systematic,
  title={A Systematic Survey of Chemical Pre-trained Models},
  author={Xia, Jun and Zhu, Yanqiao and Du, Yuanqi and Liu, Yue and Li, Stan Z},
  journal={IJCAI},
  year={2023}
}

Acknowledgements