<!-- Allow this file to not have a first line heading -->
<!-- markdownlint-disable-file MD041 -->
<!-- inline html -->
<!-- markdownlint-disable-file MD033 -->

<div align="center">

# 🔦 pdm-plugin-torch

A utility tool for selecting the torch backend and version.


</div>

## What it does

Because torch is published in many different variants, with local versions (`+cu111`, `+cpu`, and so on) signifying the underlying compute API, it is hard to integrate into a lockfile-based workflow. The local versions are all considered the "same" package, so you cannot resolve two of them at the same time.

This tool generates multiple lockfiles just for torch and adds a pdm subcommand (`pdm torch`) to select the one you want.
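The conflict comes from PEP 440 semantics and can be seen with the third-party `packaging` library (the version-handling implementation pip uses). The version numbers below are taken from the configuration example in this README; the sketch shows that the CUDA and CPU builds share one public version, so a plain pin matches both, and a single lockfile can only record one of them:

```python
# Sketch: why +cu111 and +cpu torch builds collide in a resolver
# (PEP 440 local-version semantics, via the "packaging" library).
from packaging.specifiers import SpecifierSet
from packaging.version import Version

cuda = Version("1.10.2+cu111")
cpu = Version("1.10.2+cpu")

# Both local versions share the same public version...
assert cuda.public == cpu.public == "1.10.2"

# ...so a pin without a local label matches either build interchangeably.
spec = SpecifierSet("==1.10.2")
assert spec.contains(cuda) and spec.contains(cpu)

# A resolver therefore treats them as the same package and can lock only
# one, which is why this plugin keeps a separate lockfile per variant.
print("both variants satisfy", spec)
```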

## Configuration

These are the supported options:

```toml
[tool.pdm.plugin.torch]
dependencies = [
   "torch==1.10.2"
]
lockfile = "torch.lock"
enable-cpu = true

enable-rocm = true
rocm-versions = ["4.2"]

enable-cuda = true
cuda-versions = ["cu111", "cu113"]
```
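With a configuration like the one above, the workflow is roughly the following sketch. The exact subcommand and argument names here are assumptions, not verified against the plugin; check `pdm torch --help` for the real interface:

```shell
# Generate the per-variant torch lockfiles declared in
# [tool.pdm.plugin.torch] (hypothetical invocation).
pdm torch lock

# Select and install one configured variant, e.g. a CUDA build
# ("cu111" must be one of the configured cuda-versions).
pdm torch install cu111
```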

## Installation

PDM supports specifying plugin dependencies in your `pyproject.toml`, which is the suggested installation method. Note that in pdm-plugin-torch versions before 23.4.0, the configuration lived under `tool.pdm.plugins.torch`; when upgrading, you'll also need to change that to `tool.pdm.plugin.torch`.

```toml
[tool.pdm]
plugins = ["pdm-plugin-torch==$VERSION"]
```

## Contribution


We welcome community contributions to this project.

Please read our Contributor Guide for more information on how to get started. Please also read our Contributor Terms before you make any contributions.

Any contribution intentionally submitted for inclusion in an Embark Studios project shall comply with the Rust standard licensing model (MIT OR Apache-2.0) and therefore be dual licensed as described below, without any additional terms or conditions:

## License

This contribution is dual licensed under EITHER OF

- Apache License, Version 2.0
- MIT license

at your option.

For clarity, "your" refers to Embark or any other licensee/user of the contribution.