
<div align="center">

[CVPR 2023] ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data

Project Page | arXiv


</div>

This is a PyTorch implementation of the paper ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data. Code will be released here.

<div class="is-size-5 publication-authors"> <span class="author-block"> <a href="https://scholar.google.com/citations?hl=en&user=rk1ozXMAAAAJ">Haojie Zhao</a><sup>*1</sup>,</span> <span class="author-block"> <a href="https://scholar.google.com/citations?hl=en&user=p4zxPP8AAAAJ">Junsong Chen</a><sup>*1</sup>,</span> <span class="author-block"> <a href="http://faculty.dlut.edu.cn/wanglj/zh_CN/index.htm">Lijun Wang</a><sup>1</sup>, </span> <span class="author-block"> <a href="https://scholar.google.com/citations?hl=en&user=D3nE0agAAAAJ">Huchuan Lu</a><sup>1,2</sup> </span> (* indicates equal contributions) </div> <div class="is-size-5 publication-authors"> <span class="author-block"><sup>1</sup>Dalian University of Technology, China,</span> <span class="author-block"><sup>2</sup>Peng Cheng Laboratory, China</span> </div> Contact at: jschen@mail.dlut.edu.cn, haojie_zhao@mail.dlut.edu.cn

News


Dataset


1. Installation

# 1. Clone this repo
git clone https://github.com/lawrence-cj/ARKitTrack.git
cd ARKitTrack

# 2. Create conda env
conda env create -f art_env.yml
conda activate art

# 3. Install mmcv-full, mmdet, and mmdet3d (needed for BEV pooling, adapted from BEVFusion)
pip install openmim
mim install mmcv-full==1.4.0
mim install mmdet==2.20.0
python setup.py develop  # mmdet3d

2. Set project paths

Run the following command to set paths for this project.

python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir ./output

This generates lib/train/admin/local.py and lib/test/evaluation/local.py; you can adjust the paths later by editing these two files.
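The kind of edit step 2 refers to looks roughly like the sketch below. This is an illustrative fragment only: the real lib/test/evaluation/local.py is generated by create_default_local_file.py, and the exact import path, function name, and settings attributes here are assumptions that may differ from the generated file.

```python
# lib/test/evaluation/local.py -- hypothetical sketch, NOT the generated file.
# The EnvSettings import and attribute names below are assumptions; compare
# against the file that create_default_local_file.py actually writes.
from lib.test.evaluation.environment import EnvSettings

def local_env_settings():
    settings = EnvSettings()
    # Where raw tracking results are written (assumed attribute name).
    settings.results_path = './output/test/tracking_results'
    # Per-dataset root directories (assumed attribute name / layout).
    settings.depthtrack_path = './data/depthtrack'
    return settings
```

Only the path values should normally need editing; keep the attribute names that the generator produced.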

3. Evaluation

Download our trained models from Google Drive and uncompress them to output/checkpoints/.

Change the corresponding dataset paths in lib/test/evaluation/local.py.

Run the following command to test on different datasets.

python tracking/test.py --tracker art --param vitb_384_mae_ce_32x4_ep300 --dataset depthtrack --threads 2 --num_gpus 2
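To evaluate on several benchmarks in one go, the command above can be scripted. A minimal dry-run sketch: it only prints one command per dataset so you can review them first, and the dataset names other than depthtrack are assumptions (check lib/test/evaluation for the names this repo actually registers).

```shell
#!/usr/bin/env bash
# Dry-run: print one evaluation command per benchmark.
# Delete the leading "echo" to actually execute them.
TRACKER=art
PARAM=vitb_384_mae_ce_32x4_ep300
for DATASET in depthtrack cdtb arkittrack; do
  # dataset names beyond depthtrack are assumptions, not from this README
  echo python tracking/test.py --tracker "$TRACKER" --param "$PARAM" \
    --dataset "$DATASET" --threads 2 --num_gpus 2
done
```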

The raw results are stored in Google Drive.

4. Training

Download the pre-trained weights from Google Drive and uncompress them to pretrained_models/.

Change the corresponding dataset paths in lib/train/admin/local.py.

Run the following command to train for VOT.

python tracking/train.py --script art --config vitb_384_mae_ce_32x4_ep300 --save_dir ./output --mode multiple --nproc_per_node 2

Acknowledgments

Thanks to the OSTrack and BEVFusion projects, which helped us quickly implement our ideas.

Citation

@InProceedings{Zhao_2023_CVPR,
    author    = {Zhao, Haojie and Chen, Junsong and Wang, Lijun and Lu, Huchuan},
    title     = {ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {5126-5135}
}

License

This project is under the MIT license. See LICENSE for details.