Network of Tensor Time Series

This is the PyTorch implementation of the paper:

Baoyu Jing, Hanghang Tong and Yada Zhu, Network of Tensor Time Series, WWW'2021

Requirements

Packages can be installed via: pip install -r requirements.txt

Data Preparation

  1. Formulation. Formulate the co-evolving time series (or multivariate time series) as a tensor time series. Each temporal snapshot should be an M-dimensional tensor. Note that vectors and matrices are special cases of tensors.
  2. Normalization. For each individual time series within the tensor time series, normalize the values with the z-score computed from the training split.
  3. Graph construction. The m-th dimension of the tensor can be associated with a graph, represented by an adjacency matrix A^(m). The adjacency matrix should be normalized (e.g., symmetrically as D^{-1/2} A^(m) D^{-1/2}, where D is the degree matrix). Note that if a dimension is not associated with a network, use the identity matrix instead.
  4. Store the values of the tensor time series and the adjacency matrices in values.pkl and networks.pkl. Store the indicators for training, validation and testing in train_idx.pkl, val_idx.pkl and test_idx.pkl.
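
The steps above can be sketched as follows. This is a minimal illustration, not the repository's exact code: the tensor shapes, the random example graph, and the symmetric normalization D^{-1/2} A D^{-1/2} are all assumptions made for the example.

```python
import pickle
import numpy as np

# Hypothetical example: T snapshots of a 2-dimensional tensor (a matrix) of shape (n1, n2).
rng = np.random.default_rng(0)
T, n1, n2 = 100, 20, 5
values = rng.normal(size=(T, n1, n2))

# Split indicators along the time axis (illustrative 70/15/15 split).
train_idx = np.arange(0, 70)
val_idx = np.arange(70, 85)
test_idx = np.arange(85, 100)

# Z-score every individual series using statistics of the training split only.
mean = values[train_idx].mean(axis=0, keepdims=True)
std = values[train_idx].std(axis=0, keepdims=True) + 1e-8  # guard against constant series
values = (values - mean) / std

def normalize_adj(a):
    """Symmetric normalization D^{-1/2} A D^{-1/2} (a common choice; an assumption here)."""
    deg = a.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# A hypothetical random graph over the first dimension; the second dimension
# has no associated network, so it gets the identity matrix.
a1 = (rng.random((n1, n1)) > 0.7).astype(float)
a1 = np.maximum(a1, a1.T)   # make it symmetric (undirected graph)
np.fill_diagonal(a1, 0.0)
networks = [normalize_adj(a1), np.eye(n2)]

# Store values, networks, and split indicators as pickle files.
for name, obj in [("values.pkl", values), ("networks.pkl", networks),
                  ("train_idx.pkl", train_idx), ("val_idx.pkl", val_idx),
                  ("test_idx.pkl", test_idx)]:
    with open(name, "wb") as f:
        pickle.dump(obj, f)
```

After this, each series in the training split has zero mean and unit variance, and every dimension of the tensor has a (normalized or identity) adjacency matrix.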

Training

  1. Specify the mode for training: train (training only) or train-eval (evaluate the model after each epoch).
  2. Specify the task: missing (missing value recovery) or future (future value prediction).
  3. Specify the paths of the configurations for the model and training.

python main.py -cm ./configs/model.yml -cr ./configs/run_missing.yml -m train -t missing

Evaluation

  1. Specify the mode: eval.
  2. Specify the task: missing (missing value recovery) or future (future value prediction).
  3. Specify the paths of the configurations for the model and evaluation.

python main.py -cm ./configs/model.yml -cr ./configs/run_missing.yml -m eval -t missing

Citation

Please cite the following paper if you find the repository or the paper useful.

Baoyu Jing, Hanghang Tong and Yada Zhu, Network of Tensor Time Series, WWW'2021

@article{jing2021network,
  title={Network of Tensor Time Series},
  author={Jing, Baoyu and Tong, Hanghang and Zhu, Yada},
  journal={arXiv preprint arXiv:2102.07736},
  year={2021}
}