
Awesome Self-Supervised Learning for Time Series (SSL4TS)


A professionally curated list of awesome resources (papers, code, data, etc.) on Self-Supervised Learning for Time Series (SSL4TS). To the best of our knowledge, this is the first work to comprehensively and systematically summarize recent advances in self-supervised learning for modeling time series data.

We will continue to update this list with the newest resources. If you find any missing resources (paper/code) or errors, please feel free to open an issue or a pull request.

For general AI for Time Series (AI4TS) Papers, Tutorials, and Surveys at the Top AI Conferences and Journals, please check This Repo.

Survey Paper (IEEE TPAMI 2024)

Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects

Kexin Zhang, Qingsong Wen, Chaoli Zhang, Rongyao Cai, Ming Jin, Yong Liu, James Zhang, Yuxuan Liang, Guansong Pang, Dongjin Song, Shirui Pan.

If you find this repository helpful for your work, please kindly cite our TPAMI'24 paper.

```bibtex
@article{zhang2024ssl4ts,
  title={Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects},
  author={Kexin Zhang and Qingsong Wen and Chaoli Zhang and Rongyao Cai and Ming Jin and Yong Liu and James Zhang and Yuxuan Liang and Guansong Pang and Dongjin Song and Shirui Pan},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
  year={2024}
}
```

Taxonomy of Self-Supervised Learning for Time Series


<img src="SSL4TS_taxonomy.jpg" width="900" align="middle" alt="Taxonomy of self-supervised learning for time series"> <br />


<img src="generative_adversarial_ssl4ts.jpg" width="900" align="middle" alt="Generative-based and adversarial-based SSL for time series"> <br />


<img src="contrastive_ssl4ts.jpg" width="900" align="middle" alt="Contrastive-based SSL for time series"> <br />

Category of Self-Supervised Learning for Time Series

Generative-based Methods on SSL4TS

In this category, the pretext task is to generate the expected data from a given view of the input. For time series modeling, commonly used pretext tasks include forecasting future windows or specific time stamps from the past series, reconstructing the input with an encoder-decoder, and predicting the unseen part of a masked time series. This section organizes existing self-supervised representation learning methods for time series from three perspectives: autoregressive-based forecasting, autoencoder-based reconstruction, and diffusion-based generation. Note that autoencoder-based reconstruction is also commonly viewed as an unsupervised framework; in the SSL context, reconstruction mainly serves as a pretext task, and the final goal is to obtain representations through the autoencoder models. The generative-based SSL for time series is illustrated in Fig. 3.
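The masked-reconstruction pretext task above can be made concrete with a minimal NumPy sketch (an illustrative toy, not taken from any surveyed method): random timesteps of a toy multivariate series are masked out, and the reconstruction loss is computed only over the masked positions. The identity stand-in `x_hat` is a placeholder for a real encoder-decoder's output.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_series(x, mask_ratio=0.25, rng=rng):
    """Randomly mask timesteps of a (T, C) series; return masked copy and boolean mask."""
    T = x.shape[0]
    n_mask = int(T * mask_ratio)
    idx = rng.choice(T, size=n_mask, replace=False)
    mask = np.zeros(T, dtype=bool)
    mask[idx] = True
    x_masked = x.copy()
    x_masked[mask] = 0.0                      # zero out the masked timesteps
    return x_masked, mask

def masked_reconstruction_loss(x_hat, x, mask):
    """MSE computed only on the masked positions (the pretext objective)."""
    return float(np.mean((x_hat[mask] - x[mask]) ** 2))

x = rng.standard_normal((100, 3))             # toy multivariate series: T=100, C=3
x_masked, mask = mask_series(x)
x_hat = x_masked                              # stand-in for an encoder-decoder output
loss = masked_reconstruction_loss(x_hat, x, mask)
```

A model trained to drive this loss down must infer the masked values from the visible context, which is what forces it to learn useful temporal representations.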

Autoregressive-based forecasting

Autoencoder-based reconstruction

Diffusion-based generation

Contrastive-based Methods on SSL4TS

Contrastive learning is a widely used self-supervised learning strategy that has shown strong learning ability in computer vision and natural language processing. Unlike discriminative models, which learn a mapping to true labels, and generative models, which try to reconstruct the inputs, contrastive-based methods learn data representations by contrasting positive and negative samples: positive samples should have similar representations, while negative samples should have dissimilar ones. The selection of positive and negative samples is therefore crucial for contrastive-based methods. This section organizes and summarizes the existing contrastive-based methods in time series modeling according to how positive and negative samples are selected. The contrastive-based SSL for time series is illustrated in Fig. 4.
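The role of positive and negative samples can be sketched with a small NumPy implementation of an InfoNCE-style loss (an illustrative toy, not a method from the survey). In the spirit of augmentation contrast, the positive here is a jittered view of the anchor, while negatives are unrelated series.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss for one anchor: low when the positive is the
    most similar sample, high when negatives are just as similar."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    sims = np.array([cos(anchor, positive)] + [cos(anchor, n) for n in negatives])
    logits = sims / temperature
    logits -= logits.max()                    # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return float(-np.log(probs[0]))           # positive sits at index 0

anchor = rng.standard_normal(64)                          # a representation vector
positive = anchor + 0.05 * rng.standard_normal(64)        # weakly jittered view
negatives = [rng.standard_normal(64) for _ in range(8)]   # unrelated samples
loss_good = info_nce(anchor, positive, negatives)         # similar positive -> low loss
loss_bad = info_nce(anchor, rng.standard_normal(64), negatives)  # random "positive" -> high loss
```

The gap between `loss_good` and `loss_bad` shows why positive/negative selection matters: the loss only provides a useful training signal when the chosen positive is genuinely closer to the anchor than the negatives.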

Sampling contrast

Prediction contrast

Augmentation contrast

Prototype contrast

Expert knowledge contrast

Adversarial-based Methods on SSL4TS

Adversarial-based self-supervised representation learning methods utilize generative adversarial networks (GANs) to construct pretext tasks. A GAN consists of a generator $\mathcal{G}$ and a discriminator $\mathcal{D}$: the generator is responsible for producing synthetic data similar to the real data, while the discriminator is responsible for judging whether a given sample is real or synthetic. The goal of the generator is therefore to maximize the discriminator's failure rate, while the goal of the discriminator is to minimize it. According to the final task, the existing adversarial-based representation learning methods can be divided into time series generation and imputation, and auxiliary representation enhancement. The adversarial-based SSL for time series is illustrated in Fig. 5.
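The opposing objectives above can be sketched in a few lines of NumPy (illustrative only; the function names are our own, not from any surveyed method). The discriminator minimizes the negated log-likelihood of its correct real/fake decisions; the common non-saturating generator loss rewards fooling the discriminator.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    """D maximizes log D(x) + log(1 - D(G(z))); we minimize the negation.
    d_real / d_fake are D's probabilities that samples are real."""
    return float(-np.mean(np.log(d_real) + np.log(1.0 - d_fake)))

def generator_loss(d_fake):
    """Non-saturating G objective: maximize log D(G(z)), i.e. make the
    discriminator score synthetic samples as real."""
    return float(-np.mean(np.log(d_fake)))

# A confident discriminator (real -> ~1, fake -> ~0) has low loss ...
good_d = discriminator_loss(np.array([0.95, 0.9]), np.array([0.05, 0.1]))
# ... while a fooled discriminator (fake scored as real) has high loss.
fooled_d = discriminator_loss(np.array([0.55, 0.5]), np.array([0.9, 0.85]))
```

In a full training loop these two losses are minimized alternately: a generator step pushes `d_fake` toward 1 (lowering its own loss, raising the discriminator's), and a discriminator step pushes it back toward 0.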

Time series generation and imputation

Auxiliary representation enhancement

Applications and Datasets on SSL4TS

Anomaly Detection

| Dataset | Size | Dimension | Source | Link | Comment |
| --- | --- | --- | --- | --- | --- |
| PSM | 132,481 / 87,841 | 26 | [paper] | [link] | AnRa: 27.80% |
| SMD | 708,405 / 708,405 | 38 | [paper] | [link] | AnRa: 4.16% |
| MSL | 58,317 / 73,729 | 55 | [paper] | [link] | AnRa: 10.72% |
| SMAP | 135,183 / 427,617 | 25 | [paper] | [link] | AnRa: 13.13% |
| SWaT | 475,200 / 449,919 | 51 | [paper] | [link] | AnRa: 12.98% |
| WADI | 1,048,571 / 172,801 | 103 | [paper] | [link] | AnRa: 5.99% |

(AnRa: anomaly ratio.)

Forecasting

| Dataset | Size | Dimension | Source | Link | Comment |
| --- | --- | --- | --- | --- | --- |
| ETTh | 17,420 | 7 | [paper] | [link] | SaIn: 1h |
| ETTm | 69,680 | 7 | [paper] | [link] | SaIn: 15min |
| Wind | 10,957 | 28 | None | [link] | SaIn: 1day |
| Electricity | 26,304 | 321 | None | [link] | SaIn: 1h |
| ILI | 966 | 7 | None | [link] | SaIn: 1week |
| Weather | 52,696 | 21 | None | [link] | SaIn: 10min |
| Traffic | 17,544 | 862 | None | [link] | SaIn: 1h |
| Exchange | 7,588 | 8 | [paper] | [link] | SaIn: 1day |
| Solar | 52,560 | 137 | None | [link] | SaIn: 10min |

(SaIn: sampling interval.)

Classification and Clustering

| Dataset | Size | Dimension | Source | Link | Comment |
| --- | --- | --- | --- | --- | --- |
| HAR | 17,3056 / 173,056 | 9 | [paper] | [link] | Classes: 6 |
| UCR 130 | 128*M | 1 | [paper] | [link] | N/A |
| UEA 30 | 30*M | D | [paper] | [link] | N/A |

Time Series Related Survey

Self-Supervised Learning Tutorial/Survey in Other Disciplines