SiamSA: Scale-Aware Siamese Object Tracking for Vision-Based UAM Approaching

<img src="/img/UAM_approaching.gif" width="800" alt="" />

Official code for our work on UAM object tracking:

:bust_in_silhouette: Guangze Zheng, Changhong Fu*, Junjie Ye, Bowen Li, Geng Lu, and Jia Pan

1. Introduction

SiamSA aims to provide a model-free tracking solution for an unmanned aerial manipulator (UAM) while it approaches the object for manipulation. Since scale variation (SV) is far more crucial in this setting than in general object-tracking scenes, SiamSA introduces novel scale awareness built on powerful attention mechanisms.
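
For intuition, here is a minimal sketch of channel attention applied to features from different scales, which is the general idea behind scale awareness; the module name, pooling, and layer sizes below are assumptions for illustration, not the actual SiamSA implementation.

```python
import torch
import torch.nn as nn

class ScaleChannelAttention(nn.Module):
    """Minimal sketch: channel attention over features from several scales.

    Hypothetical illustration only; the real SiamSA attention design may differ.
    """
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, feats):
        # feats: list of (B, C, H, W) feature maps, one per scale
        out = []
        for f in feats:
            w = self.fc(f.mean(dim=(2, 3)))       # (B, C) descriptor -> channel weights
            out.append(f * w[:, :, None, None])   # re-weight the channels of this scale
        return out

# usage: two scales of 256-channel search-region features
att = ScaleChannelAttention(256)
scales = [torch.randn(1, 256, 25, 25), torch.randn(1, 256, 13, 13)]
reweighted = att(scales)
```

Re-weighting channels per scale lets the tracker emphasize the feature responses that stay reliable as the object's apparent size changes during approaching.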

Please refer to our project page, papers, dataset, and videos for more details.

:newspaper: [Project page] :page_facing_up: [TII Paper] :page_facing_up: [IROS Paper] :books: [UAM Tracking Dataset] :movie_camera: [TII Demo] :movie_camera: [IROS Presentation]

2. UAMT100 & UAMT20L benchmark

2.1 Introduction

2.2 Scale variation difference between UAV and UAM tracking

<img src="/img/SV.png" width="540" alt="" />

A larger area under the curve means a higher frequency of object SV. It is clear that SV in UAM tracking is much more frequent and severe than in UAV tracking.
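
As a rough sketch of how per-frame scale variation can be quantified from ground-truth boxes (the exact protocol behind the figure is described in the paper; the function below is only an assumed illustration):

```python
import numpy as np

def scale_variation(gt_boxes):
    """Ratio of object size change between consecutive frames.

    gt_boxes: (N, 4) array of [x, y, w, h] ground-truth boxes.
    Returns an (N-1,) array; values far from 1 indicate strong scale variation.
    """
    areas = gt_boxes[:, 2] * gt_boxes[:, 3]
    return np.sqrt(areas[1:] / np.maximum(areas[:-1], 1e-6))

# example: a target that grows as the UAM approaches it
boxes = np.array([[50, 50, 20, 20],
                  [48, 49, 24, 24],
                  [46, 47, 28, 28]], dtype=float)
print(scale_variation(boxes))   # approximately [1.20, 1.17]
```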

2.3 Download and evaluation

3. Get started!

3.1 Environment Setup

This code has been tested on Ubuntu 18.04, Python 3.8.3, PyTorch 1.6.0, and CUDA 10.2. Please install the related libraries before running this code:

git clone https://github.com/vision4robotics/SiamSA
cd SiamSA
pip install -r requirements.txt
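
After installation, a quick sanity check (not part of the official scripts) can confirm that the tested PyTorch/CUDA combination is active:

```python
import torch

print(torch.__version__)          # expected around 1.6.0
print(torch.cuda.is_available())  # True if the CUDA 10.2 runtime is visible
```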

3.2 Test

# --trackername: tracker name | --dataset: dataset name | --snapshot: model path
python tools/test.py \
	--trackername SiamSA \
	--dataset UAMT100 \
	--snapshot snapshot/model.pth

The test results will be saved in the results/<dataset_name>/<tracker_name> directory.
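
Assuming the common per-sequence text format of one x, y, w, h box per line (an assumption; check the files your run produces), a result file can be inspected with a few lines of Python:

```python
import numpy as np

# Hypothetical path; adjust dataset, tracker, and sequence names to your run.
boxes = np.loadtxt("results/UAMT100/SiamSA/sequence_name.txt", delimiter=",")
print(boxes.shape)   # (num_frames, 4) if each row is x, y, w, h
print(boxes[:3])
```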

3.3 Evaluate

If you want to evaluate the tracker tested above, please put the results into the results directory first.

# --tracker_path: result path | --dataset: dataset name | --tracker_prefix: tracker name
python eval.py \
	--tracker_path ./results \
	--dataset UAMT100 \
	--tracker_prefix 'model'
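
Tracking benchmarks of this kind are commonly scored with success (overlap) and precision (center error) plots; as an assumed illustration of the overlap used for the success score (not the toolkit's actual code):

```python
import numpy as np

def iou(box_a, box_b):
    """IoU of two [x, y, w, h] boxes; the success plot thresholds this overlap."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

print(iou([0, 0, 10, 10], [5, 0, 10, 10]))  # 0.333... for half-overlapping boxes
```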

3.4 Train

4. Cite SiamSA and UAM tracking benchmark

If you find SiamSA and the UAM tracking benchmark useful, please cite our work using the following BibTeX entries:

@inproceedings{SiamSA2022IROS,
 title={{Siamese Object Tracking for Vision-Based UAM Approaching with Pairwise Scale-Channel Attention}},
 author={Zheng, Guangze and Fu, Changhong and Ye, Junjie and Li, Bowen and Lu, Geng and Pan, Jia},
 booktitle={Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
 pages={10486--10492},
 year={2022}
}
@article{SiamSA2023TII,
 title={{Scale-Aware Siamese Object Tracking for Vision-Based UAM Approaching}},
 author={Zheng, Guangze and Fu, Changhong and Ye, Junjie and Li, Bowen and Lu, Geng and Pan, Jia},
 journal={IEEE Transactions on Industrial Informatics},
 pages={1--12},
 year={2023}
}

Contact

If you have any questions, don't hesitate to get in touch with me.

Guangze Zheng

Email: mmlp@tongji.edu.cn

Homepage: Guangze Zheng (george-zhuang.github.io)

Acknowledgement