# diffsort - Differentiable Sorting Networks
Official implementation for our ICLR 2022 Paper "Monotonic Differentiable Sorting Networks" and our ICML 2021 Paper "Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision". In this work, we leverage classic sorting networks and relax them to propose a new differentiable sorting function: diffsort. This allows propagating gradients through (an approximation of) the sorting / ranking operation. On the four-digit MNIST and SVHN sorting tasks, diffsort outperforms existing differentiable sorting functions. In this repo, we present the PyTorch implementation of differentiable sorting networks.
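To illustrate the core idea of relaxing a sorting network, here is a minimal, dependency-free sketch (not the library's actual implementation): each conditional swap of an odd-even transposition network is replaced by a soft, sigmoid-weighted mixture of its two inputs, which makes every layer differentiable. The function names and the `steepness` semantics are chosen for illustration only.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soft_swap(a, b, steepness=10.0):
    # Relaxed conditional swap: alpha -> 1 when b >= a (pair already ordered),
    # alpha -> 0 when a > b (swap needed). Differentiable in both a and b.
    alpha = sigmoid(steepness * (b - a))
    lo = alpha * a + (1.0 - alpha) * b
    hi = alpha * b + (1.0 - alpha) * a
    return lo, hi

def soft_odd_even_sort(x, steepness=10.0):
    # Odd-even transposition network with all hard swaps relaxed.
    x = list(x)
    n = len(x)
    for layer in range(n):
        for i in range(layer % 2, n - 1, 2):
            x[i], x[i + 1] = soft_swap(x[i], x[i + 1], steepness)
    return x

print(soft_odd_even_sort([3.0, 1.0, 2.0], steepness=50.0))
```

For large `steepness`, the output approaches the hard sort; for small values, the relaxation is smoother and gradients are better behaved, which is the trade-off the steepness parameter controls in diffsort as well.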
Video @ YouTube.
## 💻 Installation
`diffsort` can be installed via pip from PyPI with

```shell
pip install diffsort
```
Or from source, e.g., in a virtual environment like

```shell
virtualenv -p python3 .env1
. .env1/bin/activate
pip install .
```
## 👩‍💻 Usage
```python
import torch
from diffsort import DiffSortNet

vector_length = 2**4
vectors = torch.randperm(vector_length, dtype=torch.float32, device='cpu', requires_grad=True).view(1, -1)
vectors = vectors - 5.

# sort using a bitonic sorting network
sorter = DiffSortNet('bitonic', vector_length, steepness=5)
sorted_vectors, permutation_matrices = sorter(vectors)
print(sorted_vectors)
```
### Math vs. Code Convention
In the code, we follow the convention that `sort(x) = x @ P`, where `P` is the permutation matrix (`P = sorter(x)[1]`). In contrast, in typical mathematical (written) notation, we would multiply as $\mathrm{sort}(x) = P \cdot x$. Accordingly, `P` $= P^\top$, i.e., there is a transpose between the typical mathematical permutation notation (which is also used in the papers) and the permutation matrices returned by the code.
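The transpose relation between the two conventions can be checked with a toy 3x3 permutation matrix in plain Python (the helper functions below are illustrative, not part of the library):

```python
def matvec(P, x):
    # P @ x, with P given as a list of rows
    return [sum(p * v for p, v in zip(row, x)) for row in P]

def vecmat(x, P):
    # x @ P
    n = len(P[0])
    return [sum(x[i] * P[i][j] for i in range(len(x))) for j in range(n)]

def transpose(P):
    return [list(col) for col in zip(*P)]

x = [2.0, 0.0, 1.0]
# Permutation matrix in the code's convention: sort(x) = x @ P
P = [[0, 0, 1],
     [1, 0, 0],
     [0, 1, 0]]

print(vecmat(x, P))              # x @ P  -> sorted x
print(matvec(transpose(P), x))   # P^T @ x -> the same sorted x
```

Both products yield the same sorted vector, confirming that the matrix returned by the code is the transpose of the permutation matrix in the papers' notation.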
## 🧪 Experiments
You can find the main experiment in this Colab notebook.
You can run the four-digit MNIST experiment as

```shell
python experiments/main.py -n 5 -m odd_even -s 10 -d mnist
```

or for the bitonic network

```shell
python experiments/main.py -n 16 -m bitonic -s 20 -d mnist
```

or for SVHN

```shell
python experiments/main.py -n 5 -m odd_even -s 10 -d svhn
```
## 📖 Citing
```bibtex
@inproceedings{petersen2022monotonic,
  title={Monotonic Differentiable Sorting Networks},
  author={Petersen, Felix and Borgelt, Christian and Kuehne, Hilde and Deussen, Oliver},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2022}
}

@inproceedings{petersen2021diffsort,
  title={Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision},
  author={Petersen, Felix and Borgelt, Christian and Kuehne, Hilde and Deussen, Oliver},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2021}
}
```
## License

`diffsort` is released under the MIT license. See LICENSE for additional details.