LitMatter

A template for rapid experimentation and scaling deep learning models on molecular and crystal graphs.

How to use

  1. Clone this repository and start editing, or save it and use it as a template for new projects.
  2. Edit lit_models/models.py with the PyTorch code for your model of interest (see the sketch after this list).
  3. Edit lit_data/data.py to load and process your PyTorch datasets.
  4. Perform interactive experiments in prototyping.py.
  5. Scale network training to any number of GPUs using the example batch scripts.
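
For orientation, here is a minimal sketch of what steps 2 and 3 can look like in practice. The class names (`LitModel`, `LitDataModule`), the toy network, and the synthetic data are placeholders for illustration, not the template's actual contents; it assumes `torch` and `pytorch_lightning` are installed.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset, random_split
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    """Placeholder for the kind of LightningModule that lives in lit_models/models.py."""

    def __init__(self, input_dim: int = 16, hidden_dim: int = 64, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()
        # Stand-in network; swap in your molecular or crystal graph model here.
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", nn.functional.mse_loss(self.net(x), y))

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)


class LitDataModule(pl.LightningDataModule):
    """Placeholder for the kind of LightningDataModule that lives in lit_data/data.py."""

    def __init__(self, batch_size: int = 32):
        super().__init__()
        self.batch_size = batch_size

    def setup(self, stage=None):
        # Synthetic tensors stand in for your real dataset loading and processing.
        dataset = TensorDataset(torch.randn(512, 16), torch.randn(512, 1))
        self.train_set, self.val_set = random_split(dataset, [448, 64])

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)

    def val_dataloader(self):
        return DataLoader(self.val_set, batch_size=self.batch_size)
```

Everything experiment-specific lives in these two classes, so the surrounding training and scaling code does not need to change when you swap in a new model or dataset.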

Principles

LitMatter uses PyTorch Lightning to organize PyTorch code so scientists can rapidly experiment with geometric deep learning and scale up to hundreds of GPUs without difficulty. Many amazing applied ML methods (even those with open-source code) are never used by the wider community because the important details are buried in hundreds of lines of boilerplate code. It may require a significant engineering effort to get the method working on a new dataset and in a different computing environment, and it can be hard to justify this effort before verifying that the method will provide some advantage. Packaging your code with the LitMatter template makes it easy for other researchers to experiment with your models and scale them beyond common benchmark datasets.
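To make the scaling claim concrete, here is a hedged sketch (the `TinyTask` model is a throwaway stand-in, not a LitMatter class): moving from single-device prototyping to a multi-node data-parallel run changes only the Trainer arguments, not the model or data code.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyTask(pl.LightningModule):
    # Throwaway stand-in model; the point here is the Trainer call, not the network.
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(16, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.net(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


if __name__ == "__main__":
    loader = DataLoader(TensorDataset(torch.randn(256, 16), torch.randn(256, 1)), batch_size=32)

    # Prototype on whatever single device is available ...
    trainer = pl.Trainer(max_epochs=1, accelerator="auto", devices=1)

    # ... then scale out by changing only the Trainer arguments, for example:
    # trainer = pl.Trainer(max_epochs=1, accelerator="gpu", devices=8,
    #                      num_nodes=16, strategy="ddp")

    trainer.fit(TinyTask(), train_dataloaders=loader)
```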

Features

Examples

The example notebooks show how to use LitMatter to scale model training for different applications.

Note that these examples have additional dependencies beyond the core dependencies of LitMatter.

References

If you use LitMatter for your own research and scaling experiments, please cite the following works:

Frey, Nathan C., et al. "Neural Scaling of Deep Chemical Models." Nature Machine Intelligence (2023).

@article{frey2022neural,
  title={Neural scaling of deep chemical models},
  author={Frey, Nathan and Soklaski, Ryan and Axelrod, Simon and Samsi, Siddharth and Gomez-Bombarelli, Rafael and Coley, Connor and Gadepally, Vijay},
  journal={Nature Machine Intelligence},
  doi={10.1038/s42256-023-00740-3},
  year={2023}
}

Frey, Nathan C., et al. "Scalable Geometric Deep Learning on Molecular Graphs." NeurIPS 2021 AI for Science Workshop. 2021.

@inproceedings{frey2021scalable,
  title={Scalable Geometric Deep Learning on Molecular Graphs},
  author={Frey, Nathan C and Samsi, Siddharth and McDonald, Joseph and Li, Lin and Coley, Connor W and Gadepally, Vijay},
  booktitle={NeurIPS 2021 AI for Science Workshop},
  year={2021}
}

Please also cite the relevant frameworks (PyG, PyTorch Distributed, PyTorch Lightning) and any extensions you use (🤗, DeepChem, NFFs, etc.).

Extensions

When you're ready to upgrade to fully configurable, reproducible, and scalable workflows, use hydra-zen. hydra-zen integrates seamlessly with LitMatter to self-document ML experiments and orchestrate multiple training runs for extensive hyperparameter sweeps.
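
As a rough sketch of what that integration can look like (a hedged example, not LitMatter's actual configuration code; the imported class names are hypothetical and match the placeholder sketches earlier on this page), hydra-zen can generate structured configs directly from your class signatures and launch a recorded hyperparameter sweep:

```python
from hydra_zen import builds, instantiate, launch, make_config

# Hypothetical imports: point these at your own LightningModule and DataModule
# (the class names here match the placeholder sketches earlier on this page).
from lit_models.models import LitModel
from lit_data.data import LitDataModule

# Structured configs generated directly from the class signatures.
ModelConf = builds(LitModel, populate_full_signature=True)
DataConf = builds(LitDataModule, populate_full_signature=True)
Config = make_config(model=ModelConf, data=DataConf)


def train_task(cfg):
    """Hydra task: instantiate the configured objects and run a normal Lightning fit."""
    import pytorch_lightning as pl

    model = instantiate(cfg.model)
    datamodule = instantiate(cfg.data)
    pl.Trainer(max_epochs=1, accelerator="auto", devices=1).fit(model, datamodule=datamodule)


# Multirun hyperparameter sweep: Hydra records the composed config for every run,
# so each experiment documents itself and can be reproduced from its config file.
launch(
    Config,
    train_task,
    overrides=["model.lr=1e-3,1e-4", "model.hidden_dim=64,128"],
    multirun=True,
)
```

Swapping in a different model or dataset only means pointing builds() at a different class; the sweep and logging machinery stays the same.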

Environment

Version management in Python is never fun, and deep learning dependencies are always changing, but here are the latest tested versions of key dependencies for LitMatter.

Disclaimer

DISTRIBUTION STATEMENT A. Approved for public release. Distribution is unlimited.

© 2021 MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Subject to FAR 52.227-11 – Patent Rights – Ownership by the Contractor (May 2014)
SPDX-License-Identifier: MIT

This material is based upon work supported by the Under Secretary of Defense for Research and Engineering under Air Force Contract No. FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Under Secretary of Defense for Research and Engineering.

The software/firmware is provided to you on an As-Is basis.