SparseTT

The official implementation for paper "SparseTT: Visual Tracking with Sparse Transformers".

This paper was accepted by IJCAI 2022 as a long oral presentation.

Installation

conda create -n SparseTT python=3.7 -y
conda activate SparseTT
conda install pytorch==1.10.0 torchvision==0.11.0 torchaudio==0.10.0 cudatoolkit=11.3 -c pytorch -c conda-forge
# PyTorch >= 1.9.0 is required
pip install -r requirements.txt
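Since the environment requires PyTorch 1.9.0 or newer, it can help to sanity-check the installed version before running the tracker. A minimal sketch (the helper below is hypothetical, not part of the repository):

```python
def meets_requirement(version: str, minimum=(1, 9, 0)) -> bool:
    """Return True if a PyTorch-style version string satisfies the minimum.

    Strips local build suffixes such as "+cu113" before comparing,
    so "1.10.0+cu113" is treated as 1.10.0.
    """
    parts = version.split("+")[0].split(".")
    return tuple(int(p) for p in parts[:3]) >= minimum

# The version pinned above satisfies the >= 1.9.0 requirement:
print(meets_requirement("1.10.0"))  # True
```

In practice you would pass `torch.__version__` to this helper after activating the conda environment.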

Test

├── SparseTT
|   ├── ...
|   ├── ...
|   ├── datasets
|   |   ├── COCO -> /opt/data/COCO
|   |   ├── GOT-10k -> /opt/data/GOT-10k
|   |   ├── ILSVRC2015 -> /opt/data/ILSVRC2015
|   |   ├── LaSOT -> /opt/data/LaSOT/LaSOTBenchmark
|   |   ├── OTB
|   |   |   └── OTB2015 -> /opt/data/OTB2015
|   |   ├── TrackingNet -> /opt/data/TrackingNet
|   |   ├── UAV123 -> /opt/data/UAV123/UAV123

i. Star notation (*): these datasets are used only for training. You can ignore them if you only want to test the tracker.

ii. Here, we create soft links for every dataset; the real storage location of all datasets is /opt/data/. Change the paths according to your setup.
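The links in the tree above can be created with `ln -s`. A sketch, assuming your data actually lives under /opt/data/ and you run it from the directory containing SparseTT (adjust both sides of each link to your setup):

```shell
# Create the datasets directory and the nested OTB folder.
mkdir -p SparseTT/datasets/OTB

# Soft-link each dataset to its real storage location.
# -s: symbolic, -f: replace existing, -n: treat an existing link as a file.
ln -sfn /opt/data/COCO                  SparseTT/datasets/COCO
ln -sfn /opt/data/GOT-10k               SparseTT/datasets/GOT-10k
ln -sfn /opt/data/ILSVRC2015            SparseTT/datasets/ILSVRC2015
ln -sfn /opt/data/LaSOT/LaSOTBenchmark  SparseTT/datasets/LaSOT
ln -sfn /opt/data/OTB2015               SparseTT/datasets/OTB/OTB2015
ln -sfn /opt/data/TrackingNet           SparseTT/datasets/TrackingNet
ln -sfn /opt/data/UAV123/UAV123         SparseTT/datasets/UAV123
```

Note that `ln -s` succeeds even if the target does not exist yet, so verify the links resolve (e.g. with `ls -lL SparseTT/datasets/`) before launching a run.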

GOT-10k

python main/test.py --config experiments/sparsett/test/got10k/sparsett_swin_got10k.yaml

LaSOT

python main/test.py --config experiments/sparsett/test/lasot/sparsett_swin_lasot.yaml

TrackingNet

python main/test.py --config experiments/sparsett/test/trackingnet/sparsett_swin_trackingnet.yaml

UAV123

python main/test.py --config experiments/sparsett/test/uav123/sparsett_swin_uav123.yaml

OTB2015

python main/test.py --config experiments/sparsett/test/otb2015/sparsett_swin_otb2015.yaml

Training

GOT-10k

python main/train.py --config experiments/sparsett/train/got10k/sparsett_swin_train_got10k.yaml

fulldata

python main/train.py --config experiments/sparsett/train/fulldata/sparsett_swin_train_fulldata.yaml

Testing Results

Click here to download all testing results.

Acknowledgement

Repository

This repository is built on top of the single object tracking framework video_analyst. See it for more instructions and details.

References

@inproceedings{fu2022sparsett,
  title={SparseTT: Visual Tracking with Sparse Transformers},
  author={Fu, Zhihong and Fu, Zehua and Liu, Qingjie and Cai, Wenrui and Wang, Yunhong},
  booktitle={IJCAI},
  year={2022}
}

Contact

If you have any questions, feel free to open an issue or email me. :smile: