Unsupervised Scalable Representation Learning for Multivariate Time Series -- Code

This is the code corresponding to the experiments conducted for the work "Unsupervised Scalable Representation Learning for Multivariate Time Series" (Jean-Yves Franceschi, Aymeric Dieuleveut and Martin Jaggi) [NeurIPS] [arXiv] [HAL], presented at NeurIPS 2019. A previous version was presented at the 2nd LLD workshop at ICLR 2019.

Requirements

Experiments were run under Python 3.6 with the following package versions:

This code should also run correctly with more recent versions of these packages.

Datasets

The datasets used in this code can be downloaded from the following locations:

Files

Core

Tests

Results and Visualization

Usage

Training on the UCR and UEA archives

To train a model on the Mallat dataset from the UCR archive:

python3 ucr.py --dataset Mallat --path path/to/Mallat/folder/ --save_path /path/to/save/models --hyper default_hyperparameters.json [--cuda --gpu 0]

Adding the --load option allows loading a model from the specified save path instead of training one from scratch. Training on the UEA archive with uea.py is done in a similar way, as in the example below.
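For instance, a similar invocation for a UEA dataset could look like the following (the dataset name and paths are placeholders, and uea.py is assumed here to accept the same options as ucr.py):

python3 uea.py --dataset BasicMotions --path path/to/BasicMotions/folder/ --save_path /path/to/save/models --hyper default_hyperparameters.json [--cuda --gpu 0]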

Further Documentation

See the code documentation for more details. ucr.py, uea.py, transfer_ucr.py, combine_ucr.py and combine_uea.py can be called with the -h option for additional help.
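For example, to display the built-in help of the UCR training script (the same pattern applies to the other scripts listed above):

python3 ucr.py -h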

Hyperparameters

Hyperparameters are described in Section S2.2 of the paper.

For the UCR and UEA hyperparameters, two values were accidentally switched; as reflected in the example configuration file, one should read:

instead of

Pretrained Models

Pretrained models are downloadable at https://data.lip6.fr/usrlts/.
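As a sketch, assuming the models are served as plain files with directory listings enabled at that URL, they could for instance be fetched recursively with wget:

wget -r -np -nH https://data.lip6.fr/usrlts/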