Social-SSL

This is the official implementation of our paper:

Social-SSL: Self-Supervised Cross-Sequence Representation Learning Based on Transformers for Multi-Agent Trajectory Prediction
Paper (PDF)
Supplementary Materials (PDF)
Li-Wu Tsao, Yan-Kai Wang, Hao-Siang Lin, Hong-Han Shuai, Lai-Kuan Wong, Wen-Huang Cheng

Environment

Preprocessing & Datasets

The preprocessed version of the ETH/UCY dataset can be downloaded here.

More details on preprocessing, along with the tools for converting the raw datasets to our version, can be found in the utils/dataset_convertor/ folder.
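As a quick illustration of what the preprocessed data looks like, here is a minimal sketch assuming the common ETH/UCY text format of (frame, agent_id, x, y) rows and the standard 8-observed / 12-predicted frame split; the `split_window` helper and the sample rows are hypothetical, not part of this repo:

```python
import numpy as np

# Illustrative only: rows in the common ETH/UCY text format are
# (frame, agent_id, x, y); these sample values are made up.
rows = np.array([[10.0 * t, 1.0, 0.1 * t, 0.0] for t in range(20)])

def split_window(traj, obs_len=8, pred_len=12):
    """Split one agent's (T, 2) trajectory into observed and future parts,
    following the standard ETH/UCY 8-in / 12-out convention."""
    assert len(traj) >= obs_len + pred_len, "trajectory too short"
    return traj[:obs_len], traj[obs_len:obs_len + pred_len]

obs, fut = split_window(rows[:, 2:4])  # keep only the (x, y) columns
print(obs.shape, fut.shape)  # (8, 2) (12, 2)
```

The actual converters in utils/dataset_convertor/ handle the per-dataset details; this only shows the windowing convention.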

Training pretext

$ python train_pretext.py

Finetune on downstream (Trajectory Prediction)

Please see our paper for details; this is also discussed in an issue.
$ python finetune.py

Evaluation

$ python eval.py
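Trajectory prediction on ETH/UCY is conventionally evaluated with ADE and FDE. A minimal sketch of these standard metrics for a single agent (the `ade_fde` helper is illustrative, not the implementation in eval.py):

```python
import numpy as np

def ade_fde(pred, gt):
    """Average / Final Displacement Error for one (T, 2) trajectory:
    ADE is the mean Euclidean error over all timesteps,
    FDE is the error at the final timestep."""
    d = np.linalg.norm(np.asarray(pred) - np.asarray(gt), axis=-1)
    return float(d.mean()), float(d[-1])

pred = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
gt   = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
print(ade_fde(pred, gt))  # (1.0, 2.0)
```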

Qualitative Result

Citation

If you find our work relevant to your research, please cite:

@inproceedings{tsao2022social,
  title={Social-SSL: Self-supervised Cross-Sequence Representation Learning Based on Transformers for Multi-agent Trajectory Prediction},
  author={Tsao, Li-Wu and Wang, Yan-Kai and Lin, Hao-Siang and Shuai, Hong-Han and Wong, Lai-Kuan and Cheng, Wen-Huang},
  booktitle={European Conference on Computer Vision},
  pages={234--250},
  year={2022},
  organization={Springer}
}