(ICLR'24) Rethinking Channel Dependence for Multivariate Time Series Forecasting: Learning from Leading Indicators

This repo is the official PyTorch implementation of Rethinking Channel Dependence for Multivariate Time Series Forecasting: Learning from Leading Indicators.

Takeaways

Scripts

An example:

python -u run_longExp.py --dataset Weather --model DLinear --lift --seq_len 336 --pred_len 96 --leader_num 4 --state_num 8 --learning_rate 0.0005

The scripts/ directory contains our scripts for re-running experiments, but it does not cover all datasets. We slightly revised our method after the paper submission and have not yet re-run all experiments due to limited computing resources, so you may need to perform hyperparameter tuning on your own.

We recommend first obtaining a pretrained, frozen backbone, which reduces the time cost of tuning LIFT's hyperparameters. (Add the args --pretrain --freeze to your scripts to load a frozen backbone.)

Precomputing

Our implementation precomputes the leading indicators and the leading steps at all time steps over the dataset, which are saved to the prefetch/ directory.
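As a rough sketch of what this precomputation might look like (the function name, the cross-correlation criterion, and the cache path below are illustrative assumptions, not the repo's actual prefetch code):

```python
import os
import numpy as np

def estimate_leading_indicators(x, max_lag=8, k=4):
    """Illustrative sketch: for each target channel, rank the other
    channels by their maximum absolute cross-correlation within
    `max_lag` steps and keep the top-k as leading indicators.
    x: array of shape (T, C) -- T time steps, C channels.
    Returns (leaders, lags), each of shape (C, k)."""
    T, C = x.shape
    x = (x - x.mean(0)) / (x.std(0) + 1e-8)  # normalize per channel
    leaders = np.zeros((C, k), dtype=np.int64)
    lags = np.zeros((C, k), dtype=np.int64)
    for i in range(C):
        scores = np.full(C, -np.inf)
        best_lag = np.zeros(C, dtype=np.int64)
        for j in range(C):
            if j == i:
                continue
            for lag in range(1, max_lag + 1):
                # correlation between channel j shifted forward by `lag`
                # and channel i (i.e. j leads i by `lag` steps)
                c = np.abs(np.mean(x[:-lag, j] * x[lag:, i]))
                if c > scores[j]:
                    scores[j], best_lag[j] = c, lag
        top = np.argsort(scores)[::-1][:k]
        leaders[i], lags[i] = top, best_lag[top]
    return leaders, lags

def prefetch_leaders(x, path="prefetch/leaders.npz", **kw):
    # Cache the result on disk so training runs can reuse it,
    # mirroring the role of the prefetch/ directory.
    if os.path.exists(path):
        d = np.load(path)
        return d["leaders"], d["lags"]
    leaders, lags = estimate_leading_indicators(x, **kw)
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    np.savez(path, leaders=leaders, lags=lags)
    return leaders, lags
```

The expensive double loop runs once per dataset; training then only indexes into the cached arrays.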

Given a frozen backbone, we also precompute the backbone's predictions only once and save them to the results/ directory.
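A minimal sketch of this caching pattern (the helper name and cache path are illustrative, not the repo's actual code):

```python
import os
import numpy as np

def cached_backbone_predictions(backbone, inputs, path):
    """Compute a frozen backbone's predictions once and cache them on
    disk, mirroring the role of the results/ directory. `backbone` is
    any callable mapping an (N, seq_len, C) array to (N, pred_len, C);
    subsequent calls load the cached file instead of recomputing."""
    if os.path.exists(path):
        return np.load(path)
    preds = backbone(inputs)
    os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
    np.save(path, preds)
    return preds
```

Because the backbone is frozen, its outputs are deterministic, so the cached predictions stay valid across all LIFT hyperparameter runs.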

The LIFT module then directly takes in these precomputed tensors, without recomputing the lead-lag relationships (or the backbone's predictions, when --pretrain --freeze are set).

To avoid repeatedly transferring the input tensors from RAM to GPU memory, we keep all input tensors in GPU memory by default. Set --pin_gpu False if your GPU memory is limited.
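The trade-off behind this flag can be sketched as follows (assuming PyTorch; the helper names here are illustrative, not the repo's actual code):

```python
import torch

def place_tensors(tensors, pin_gpu=True):
    """With pin_gpu=True, move all precomputed input tensors to the GPU
    once and keep them resident there (the default behavior described
    above). With pin_gpu=False, leave them in CPU RAM. Falls back to
    CPU when no GPU is available."""
    device = torch.device(
        "cuda" if pin_gpu and torch.cuda.is_available() else "cpu"
    )
    return [t.to(device) for t in tensors], device

def get_batch(tensors, idx, compute_device):
    # With pin_gpu=False the full tensors stay on CPU, and only the
    # current batch is copied to the GPU on demand -- slower per step,
    # but the GPU holds one batch instead of the whole dataset.
    return [t[idx].to(compute_device) for t in tensors]
```

Keeping everything resident avoids a host-to-device copy on every step, at the cost of GPU memory proportional to the dataset size.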

Datasets

All benchmarks can be downloaded from Google Drive.

Requirements

pip3 install -r requirements.txt

Citation

If you find this useful for your work, please consider citing it as follows:

@inproceedings{LIFT,
  title={Rethinking Channel Dependence for Multivariate Time Series Forecasting: Learning from Leading Indicators},
  author={Lifan Zhao and Yanyan Shen},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=JiTVtCUOpS}
}