Torchélie

<img src="https://github.com/Vermeille/Torchelie/blob/master/logo.png" height="200"/>


Torchélie is a set of tools for PyTorch. It includes losses, optimizers, algorithms, utils, layers, models and training loops.

Feedback is absolutely welcome.

You may want to read the detailed docs.

Installation

pip install git+https://github.com/vermeille/Torchelie

It depends on PyTorch (obviously) and has an optional dependency on OpenCV for some transforms (Canny, as of today). It also depends on Visdom for realtime visualizations, plotting, etc.

To install Visdom: pip install visdom. Then run a Visdom server with python -m visdom.server and point your browser to http://localhost:8097. You're now ready to use VisdomLogger and enjoy realtime tracking of your experiments.

⚠ WARNINGS ⚠

The Torchélie API is in beta and can be a bit unstable. Minor breaking changes can happen.

Code, README, docs and tests might be out of sync in general. Please tell me if you notice anything wrong.

Torchelie Hello World

Let's say you want to do the hello-world of deep learning: MNIST handwritten digits classification. Let's also assume that you already have your training and testing datasets organised properly, e.g. coming from the Kaggle archive:

$ tree mnist_png

mnist_png
├── testing
│   ├── 0
│   ├── 1
│   ├── 2
│   ├── 3
│   ├── 4
│   ├── 5
│   ├── 6
│   ├── 7
│   ├── 8
│   └── 9
└── training
    ├── 0
    ├── 1
    ├── 2
    │   ├── 10009.png
    │   ├── 10016.png
    │   └── [...]
    ├── 3
    ├── 4
    ├── 5
    ├── 6
    ├── 7
    ├── 8
    └── 9
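This folder-per-class layout is the same convention understood by torchvision's ImageFolder, so you can sanity-check it yourself. A minimal sketch (purely optional, the recipe below does not require this step):

```python
# Minimal sketch (optional): load the folder-per-class layout above with torchvision.
# This only shows what the layout encodes; the classification recipe handles it for you.
from torchvision import datasets, transforms

trainset = datasets.ImageFolder(
    "mnist_png/training",
    transform=transforms.ToTensor(),  # each subfolder name (0-9) becomes a class label
)
print(trainset.classes)  # ['0', '1', ..., '9']
print(len(trainset))     # number of training images
```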

Torchelie comes with a classification "recipe" out-of-the-box, which can be used directly to train a model straight from the command line:

$ python3 -m torchelie.recipes.classification --trainset mnist_png/training --testset mnist_png/testing

[...]
 | Ep. 0 It 1 | {'lr_0': '0.0100', 'acc': '0.0938', 'loss': '3.1385'}
 | Ep. 0 It 11 | {'lr_0': '0.0100', 'acc': '0.2017', 'loss': '2.4109'}
 | Ep. 0 It 21 | {'lr_0': '0.0100', 'acc': '0.3185', 'loss': '2.0410'}
 | Ep. 0 It 31 | {'lr_0': '0.0100', 'acc': '0.3831', 'loss': '1.8387'}
 | Ep. 0 It 41 | {'lr_0': '0.0100', 'acc': '0.4451', 'loss': '1.6513'}
[...]
Test | Ep. 1 It 526 | [...] 'acc': '0.9799', 'loss': '0.0797' [...]
 | Ep. 1 It 556 | {'lr_0': '0.0100', 'acc': '0.9588', 'loss': '0.1362'}
 | Ep. 1 It 566 | {'lr_0': '0.0100', 'acc': '0.9606', 'loss': '0.1341'}

Want to run it on a laptop that doesn't have a GPU? Simply add the --device cpu option!

With a simple use case and a properly organized dataset, we have already seen how Torchélie helps you experiment quickly. But what just happened?

The classification recipe is a whole ready-to-use training loop which:

The cool thing is that all these building blocks are available!

torchelie.recipes

Classes implementing full algorithms, from training to usage

torchelie.utils

Functions:

torchelie.nn

Debug modules:

Normalization modules:

Misc modules:

Container modules:

Model manipulation modules:

Net Blocks:

torchelie.models

Debug models:

torchelie.loss

Modules:

Functions (torchelie.loss.functional):

torchelie.loss.gan

Each submodule is a GAN loss function. They all contain three methods: real(x) and fake(x) to train the discriminator, and generated(x) to improve the generator.
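As a rough illustration of that API, here is a hedged sketch of one training step. The hinge submodule name, and the assumption that real/fake/generated take the discriminator's raw outputs, are mine; adapt them to whichever loss from the list below you actually use.

```python
# Hedged sketch of a GAN training step with the real()/fake()/generated() API
# described above. `hinge` is an assumed submodule name, and the inputs are assumed
# to be the discriminator's raw outputs (logits).
import torch
import torchelie.loss.gan.hinge as gan_loss

def gan_step(D, G, real_images, z_dim, opt_d, opt_g):
    # Discriminator update: push D(real) up and D(fake) down.
    opt_d.zero_grad()
    z = torch.randn(real_images.size(0), z_dim)
    fake_images = G(z).detach()
    d_loss = gan_loss.real(D(real_images)) + gan_loss.fake(D(fake_images))
    d_loss.backward()
    opt_d.step()

    # Generator update: make D score generated samples as real.
    opt_g.zero_grad()
    g_loss = gan_loss.generated(D(G(torch.randn(real_images.size(0), z_dim))))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```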

Available:

torchelie.transforms

Torchvision-like transforms:

torchelie.transforms.differentiable

Contains transforms that can be backpropagated through. Its API is currently unstable.
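To illustrate what "backpropagated through" means here (a generic PyTorch sketch, not Torchélie's API), a transform built from differentiable tensor ops lets gradients reach the input image:

```python
# Generic sketch (not Torchélie's API): a transform written with differentiable
# tensor ops, so gradients flow back to the input image.
import torch
import torch.nn.functional as F

def box_blur(img: torch.Tensor, k: int = 3) -> torch.Tensor:
    """Blur a (C, H, W) image with a k x k box filter using a grouped conv2d."""
    c = img.size(0)
    kernel = torch.full((c, 1, k, k), 1.0 / (k * k))
    return F.conv2d(img.unsqueeze(0), kernel, padding=k // 2, groups=c).squeeze(0)

img = torch.rand(3, 32, 32, requires_grad=True)
box_blur(img).mean().backward()  # gradients flow back through the transform
print(img.grad.shape)            # torch.Size([3, 32, 32])
```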

torchelie.lr_scheduler

Classes:

torchelie.datasets

torchelie.datasets.debug

torchelie.metrics

torchelie.opt

torchelie.data_learning

Data parameterization for optimization, such as neural style transfer or feature visualization.

Modules:
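To give an idea of the concept (a generic sketch, not one of Torchélie's modules): the "data" itself becomes the parameter being optimized.

```python
# Generic sketch of data parameterization (not Torchélie's modules): optimize the
# pixels of an image directly, as in neural style transfer or feature visualization.
import torch

img = torch.randn(1, 3, 64, 64, requires_grad=True)  # the learnable "data"
opt = torch.optim.Adam([img], lr=0.05)

for _ in range(100):
    opt.zero_grad()
    # Stand-in objective; in practice this would be a network activation or a style loss.
    loss = (img - 0.5).pow(2).mean()
    loss.backward()
    opt.step()
```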

Testing

Testing without OpenCV

Since OpenCV is an optional dependency, you might want to run the tests without it installed (and therefore skip the Canny tests). You can do so by excluding the require_opencv custom pytest marker like so:

pytest -m "not require_opencv"
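For reference, a test guarded by that marker might look like this (a hypothetical example; the actual tests may be organized differently):

```python
# Hypothetical example of a test carrying the require_opencv marker.
import pytest

@pytest.mark.require_opencv
def test_canny_needs_opencv():
    import cv2  # only importable when OpenCV is installed
    assert cv2.__version__
```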

Contributing

Code format

Code is formatted using YAPF.

For now, the CI doesn't check code formatting and the yapf config file isn't there yet, but do your best to format your code with YAPF (or at least comply with PEP 8 🙂).

Lint

Code is linted using Flake8. Do your best to send code that doesn't make it scream too loud 😉

You can run it like this:

flake8 torchelie

Type checking

Despite typing being optional in Python, type hints can save a lot of time on a project such as Torchélie. This project is type-checked using mypy. Make sure it passes successfully, and consider adding type hints where it makes sense to do so when contributing code!

You can run it like this:

mypy torchelie
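As an illustration (not an excerpt from the codebase), a small annotated helper that mypy can check might look like:

```python
# Illustrative only: a tiny utility with type hints that mypy can verify.
from typing import Iterable

import torch

def mean_loss(losses: Iterable[torch.Tensor]) -> torch.Tensor:
    """Average a collection of scalar loss tensors."""
    return torch.stack(list(losses)).mean()
```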

Variable names

Common widespread naming best practices apply.

That being said, please specifically avoid using l as a variable name, even for iterators: first because of E741 (see "names to avoid" in PEP 8), and second because in the context of Torchélie it could mean layer, label, loss, length, line, or several other words that appear throughout the codebase, which makes code using it considerably harder to read. For example (illustrative only):
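```python
# Illustrative only: a descriptive name answers "layer? label? loss? length?" by itself.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
criterion = nn.CrossEntropyLoss()
images = torch.randn(8, 1, 28, 28)
labels = torch.randint(0, 10, (8,))

# Ambiguous: l = criterion(model(images), labels)
loss = criterion(model(images), labels)  # clearly the loss
for layer in model.children():           # clearly a layer
    print(type(layer).__name__)
```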