# LitMatter
A template for rapid experimentation and scaling deep learning models on molecular and crystal graphs.
## How to use
- Clone this repository and start editing, or save it and use it as a template for new projects.
- Edit `lit_models/models.py` with the PyTorch code for your model of interest.
- Edit `lit_data/data.py` to load and process your PyTorch datasets.
- Perform interactive experiments in `prototyping.py`.
- Scale network training to any number of GPUs using the example batch scripts.
## Principles
LitMatter uses PyTorch Lightning to organize PyTorch code so scientists can rapidly experiment with geometric deep learning and scale up to hundreds of GPUs without difficulty. Many amazing applied ML methods (even those with open-source code) are never used by the wider community because the important details are buried in hundreds of lines of boilerplate code. It may require a significant engineering effort to get the method working on a new dataset and in a different computing environment, and it can be hard to justify this effort before verifying that the method will provide some advantage. Packaging your code with the LitMatter template makes it easy for other researchers to experiment with your models and scale them beyond common benchmark datasets.
## Features
- Maximum flexibility. LitMatter supports arbitrary PyTorch models and dataloaders.
- Eliminate boilerplate. Engineering code is abstracted away, but still accessible if needed.
- Full end-to-end pipeline. Data processing, model construction, training, and inference can be launched from the command line, in a Jupyter notebook, or through a SLURM job.
- Lightweight. Using the template is easier than not using it; it reduces infrastructure overhead for simple and complex deep learning projects.
## Examples
The example notebooks show how to use LitMatter to scale model training for different applications.
- Prototyping GNNs - train an equivariant graph neural network to predict quantum properties of small molecules.
- Neural Force Fields - train a neural force field on molecular dynamics trajectories of small molecules.
- DeepChem - train a PyTorch model in DeepChem on a MoleculeNet dataset.
- 🤗 - train a 🤗 language model to generate molecules.
Note that these examples have additional dependencies beyond the core dependencies of LitMatter.
## References
If you use LitMatter for your own research and scaling experiments, please cite the following works: Frey, Nathan C., et al. "Neural Scaling of Deep Chemical Models." _Nature Machine Intelligence_ (2023).
```bibtex
@article{frey2022neural,
  title={Neural scaling of deep chemical models},
  author={Frey, Nathan and Soklaski, Ryan and Axelrod, Simon and Samsi, Siddharth and Gomez-Bombarelli, Rafael and Coley, Connor and Gadepally, Vijay},
  journal={Nature Machine Intelligence},
  doi={10.1038/s42256-023-00740-3},
  year={2023}
}

@inproceedings{frey2021scalable,
  title={Scalable Geometric Deep Learning on Molecular Graphs},
  author={Frey, Nathan C and Samsi, Siddharth and McDonald, Joseph and Li, Lin and Coley, Connor W and Gadepally, Vijay},
  booktitle={NeurIPS 2021 AI for Science Workshop},
  year={2021}
}
```
Please also cite the relevant frameworks: PyG, PyTorch Distributed, PyTorch Lightning,
and any extensions you use: 🤗, DeepChem, NFFs, etc.
## Extensions
When you're ready to upgrade to fully configurable, reproducible, and scalable workflows, use hydra-zen. hydra-zen integrates seamlessly with LitMatter to self-document ML experiments and orchestrate multiple training runs for extensive hyperparameter sweeps.
## Environment
Version management in Python is never fun and deep learning dependencies are always changing, but here are the latest tested versions of LitMatter's key dependencies:
- Python 3.8
- PyTorch Lightning 1.5.1
- PyTorch 1.10.0
## Disclaimer
DISTRIBUTION STATEMENT A. Approved for public release. Distribution is unlimited.
© 2021 MASSACHUSETTS INSTITUTE OF TECHNOLOGY
Subject to FAR 52.227-11 – Patent Rights – Ownership by the Contractor (May 2014)
SPDX-License-Identifier: MIT
This material is based upon work supported by the Under Secretary of Defense for Research and Engineering under Air Force Contract No. FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Under Secretary of Defense for Research and Engineering.
The software/firmware is provided to you on an As-Is basis.