Hyperspherical Variational Auto-Encoders

PyTorch implementation of Hyperspherical Variational Auto-Encoders

Overview

This library contains a PyTorch implementation of the hyperspherical variational auto-encoder, or S-VAE, as presented in [1](http://arxiv.org/abs/1804.00891). Also check out our [blog post](https://nicola-decao.github.io/s-vae).

Dependencies

Installation

To install, run

$ python setup.py install

Structure

Usage

Please have a look at the examples folder. We adapted our implementation to follow the structure of the PyTorch probability distributions (`torch.distributions`).
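As a rough illustration of what following the `torch.distributions` structure means, here is a minimal stdlib-only sketch of a distribution object exposing `sample()` and `log_prob()`. The class name and the simplification to the unit circle S^1 are hypothetical for illustration, not the library's actual API:

```python
import math
import random

class HypersphericalUniformS1:
    """Toy uniform distribution on the unit circle S^1.

    Mirrors the sample()/log_prob() interface convention of
    torch.distributions. Illustrative only -- not the library's class.
    """

    def sample(self):
        # A uniform angle gives a uniform point on the circle.
        theta = random.uniform(0.0, 2.0 * math.pi)
        return (math.cos(theta), math.sin(theta))

    def log_prob(self, x):
        # Constant density 1 / (2*pi) everywhere on S^1.
        return -math.log(2.0 * math.pi)
```

The actual library provides distributions over hyperspheres of arbitrary dimension; this toy merely shows the interface shape.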

Please cite [1] in your work when using this library in your experiments.

Sampling von Mises-Fisher

To sample from the von Mises-Fisher distribution, we follow the rejection sampling procedure outlined by Ulrich (1984). This sampling pipeline is visualized below:

<p align="center"> <img src="https://i.imgur.com/aK1ze0z.png" alt="blog toy1"/> </p>

Note that as ω is a scalar, this approach does not suffer from the curse of dimensionality. For the final transformation of the sample to the desired mean direction μ, a Householder reflection is utilized.
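The pipeline above can be sketched in plain Python. The following stdlib-only implementation of Ulrich's rejection scheme is an illustrative sketch, not the library's code; the function and variable names are our own:

```python
import math
import random

def sample_vmf(mu, kappa):
    """Draw one sample from vMF(mu, kappa) on S^(m-1).

    Rejection-samples the scalar omega (Ulrich, 1984), attaches a
    uniform tangent direction, then rotates e1 onto mu with a
    Householder reflection. Illustrative sketch only.
    """
    m = len(mu)
    # Envelope parameters for the rejection step.
    b = (-2.0 * kappa + math.sqrt(4.0 * kappa**2 + (m - 1) ** 2)) / (m - 1)
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (m - 1) * math.log(1.0 - x0**2)
    # Rejection-sample omega: a scalar, so the acceptance rate does not
    # degrade with dimension.
    while True:
        z = random.betavariate((m - 1) / 2.0, (m - 1) / 2.0)
        omega = (1.0 - (1.0 + b) * z) / (1.0 - (1.0 - b) * z)
        u = random.random()
        if kappa * omega + (m - 1) * math.log(1.0 - x0 * omega) - c >= math.log(u):
            break
    # Uniform direction v on S^(m-2), orthogonal to the first axis.
    v = [random.gauss(0.0, 1.0) for _ in range(m - 1)]
    norm = math.sqrt(sum(t * t for t in v))
    v = [t / norm for t in v]
    # Assemble z' = (omega, sqrt(1 - omega^2) * v): a vMF sample around e1.
    zprime = [omega] + [math.sqrt(1.0 - omega**2) * t for t in v]
    # Householder reflection U = I - 2*u*u^T mapping e1 onto mu.
    e1 = [1.0] + [0.0] * (m - 1)
    u_vec = [a - b2 for a, b2 in zip(e1, mu)]
    n = math.sqrt(sum(t * t for t in u_vec))
    if n < 1e-12:
        return zprime  # mu is already e1; no rotation needed
    u_vec = [t / n for t in u_vec]
    dot = sum(a * b2 for a, b2 in zip(u_vec, zprime))
    return [zi - 2.0 * dot * ui for zi, ui in zip(zprime, u_vec)]
```

The Householder reflection is an orthogonal map, so the returned point stays on the unit sphere; with large κ, samples concentrate tightly around μ.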

Feedback

For questions and comments, feel free to contact Nicola De Cao or Tim Davidson.

License

MIT

Citation

[1] Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T.,
and Tomczak, J. M. (2018). Hyperspherical Variational
Auto-Encoders. 34th Conference on Uncertainty in Artificial Intelligence (UAI-18).

BibTeX format:

@article{s-vae18,
  title={Hyperspherical Variational Auto-Encoders},
  author={Davidson, Tim R. and
          Falorsi, Luca and
          De Cao, Nicola and
          Kipf, Thomas and
          Tomczak, Jakub M.},
  journal={34th Conference on Uncertainty in Artificial Intelligence (UAI-18)},
  year={2018}
}