
[IJCAI-24] Spatial-Temporal-Decoupled Masked Pre-training for Spatiotemporal Forecasting

Framework overview figure.

Preprint link (all six datasets [PEMS03, PEMS04, PEMS07, PEMS08, PEMS-BAY, and METR-LA] are included).

arXiv link | IJCAI link

Google Scholar

Because the paper's title was changed, you can simply search for "STD-MAE" on Google Scholar to find our article.

Citation

> [!NOTE]
> If you find this repository useful for your research, please cite our work :)

@article{gao2023spatio,
  title={Spatio-Temporal-Decoupled Masked Pre-training for Traffic Forecasting},
  author={Gao, Haotian and Jiang, Renhe and Dong, Zheng and Deng, Jinliang and Song, Xuan},
  journal={arXiv preprint arXiv:2312.00516},
  year={2023}
}

2024 Version

@article{gao2024spatial,
  title={Spatial-Temporal-Decoupled Masked Pre-training for Spatiotemporal Forecasting},
  author={Gao, Haotian and Jiang, Renhe and Dong, Zheng and Deng, Jinliang and Ma, Yuxin and Song, Xuan},
  journal={arXiv preprint arXiv:2312.00516},
  year={2024}
}

Performance on Spatiotemporal Forecasting Benchmarks

Main results.

Additional results on METR-LA, PEMS-BAY, PEMSD7(M), and PEMSD7(L).

You can check more baseline results at Torch-MTS.

💿 Dependencies

OS

Linux systems (e.g. Ubuntu and CentOS).

Python

The code is built on Python 3.9, PyTorch 1.13.0, and EasyTorch. You can install PyTorch by following the instructions on the official PyTorch website.

Miniconda or Anaconda is recommended for creating a virtual Python environment.
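
For reference, a minimal environment setup with Miniconda might look like the following (the environment name stdmae is only an example, and the exact PyTorch install command depends on your CUDA version):

# Create and activate a Python 3.9 environment (the name "stdmae" is an assumption).
conda create -n stdmae python=3.9
conda activate stdmae
# Install PyTorch 1.13.0 following the official instructions for your CUDA setup.
pip install torch==1.13.0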

Our code is implemented on top of BasicTS.

Other Dependencies

pip install -r requirements.txt

Getting started

STD-MAE has been integrated into BasicTS, where you can also find many other baselines.

Download Data

You can download the raw data from BasicTS and unzip it.
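
For example, assuming the downloaded archive is named raw_data.zip (the file name is an assumption; use whatever file BasicTS provides), you can unpack it under the project's datasets/ directory:

# Hypothetical archive name; adjust it to the file you actually downloaded from BasicTS.
cd /path/to/your/project
unzip raw_data.zip -d datasets/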

Preparing Data

You can pre-process all datasets by running:

cd /path/to/your/project
bash scripts/data_preparation/all.sh

Then the dataset directory will look like this:

datasets
   ├─PEMS03
   ├─PEMS04
   ├─PEMS07
   ├─PEMS08
   ├─raw_data
   |    ├─PEMS03
   |    ├─PEMS04
   |    ├─PEMS07
   |    ├─PEMS08
   ├─README.md
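
As an optional sanity check, you can list one of the processed dataset folders; the exact file names depend on the BasicTS preparation scripts:

# List the processed files for one dataset (file names vary with the BasicTS version).
ls datasets/PEMS04
ls datasets/raw_data/PEMS04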

Pre-training S-MAE and T-MAE

cd /path/to/your/project

Then run the following commands in a Linux screen session:

screen -d -m python stdmae/run.py --cfg='stdmae/TMAE_PEMS03.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/TMAE_PEMS04.py' --gpus='0'

screen -d -m python stdmae/run.py --cfg='stdmae/TMAE_PEMS07.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/TMAE_PEMS08.py' --gpus='0'

screen -d -m python stdmae/run.py --cfg='stdmae/SMAE_PEMS03.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/SMAE_PEMS04.py' --gpus='0'

screen -d -m python stdmae/run.py --cfg='stdmae/SMAE_PEMS07.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/SMAE_PEMS08.py' --gpus='0'
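
Each command above starts a detached run. The standard screen commands below (not specific to this repository) are handy for monitoring progress:

# List running screen sessions.
screen -ls
# Reattach to a session to watch the training output (replace <session_id> accordingly).
screen -r <session_id>
# Detach again with Ctrl-A followed by D.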

Downstream Predictor

After pre-training, copy your best pre-trained checkpoints to mask_save/. For example:

cp checkpoints/TMAE_200/064b0e96c042028c0ec44856f9511e4c/TMAE_best_val_MAE.pt mask_save/TMAE_PEMS04_864.pt
cp checkpoints/SMAE_200/50cd1e77146b15f9071b638c04568779/SMAE_best_val_MAE.pt mask_save/SMAE_PEMS04_864.pt
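
If you are unsure which run-hash folder contains the best checkpoint, the helper command below (not part of the original workflow) lists all saved best-validation-MAE checkpoints:

# Locate every best-validation-MAE checkpoint written during pre-training.
find checkpoints -name "*best_val_MAE.pt"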

Then run the predictor:

screen -d -m python stdmae/run.py --cfg='stdmae/STDMAE_PEMS04.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/STDMAE_PEMS03.py' --gpus='0' 

screen -d -m python stdmae/run.py --cfg='stdmae/STDMAE_PEMS08.py' --gpus='0'

screen -d -m python stdmae/run.py --cfg='stdmae/STDMAE_PEMS07.py' --gpus='0' 

IJCAI Poster

[IJCAI-24 poster image]