# ADRNet

The official code and results for the IJCV paper *Learning Adaptive Attribute-Driven Representation for Real-Time RGB-T Tracking*.

## Framework

*(Framework overview figure)*

## Experiments

Comparison with state-of-the-art trackers on GTOT and RGBT234 (MSR/MPR).

| Tracker | GTOT (MSR/MPR) | RGBT234 (MSR/MPR) |
| --- | --- | --- |
| ADRNet | 73.9/90.4 | 57.1/80.9 |
| CAT (ECCV'20) | 71.7/88.9 | 56.1/80.4 |
| MaCNet (Sensors'20) | 71.4/88.0 | 55.4/79.0 |
| MANet (ICCVW'19) | 72.4/89.4 | 53.9/77.7 |
| DAPNet (ACM MM'19) | 70.7/88.2 | 53.7/76.6 |
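For reference, MPR/MSR are built from per-frame center-location error and bounding-box overlap. Below is a minimal sketch of those quantities; the precision thresholds (commonly 20 px for RGBT234 and 5 px for GTOT) and the way the benchmarks combine the separate RGB/thermal annotations follow the benchmark toolkits, not code in this repo.

```python
import numpy as np

def center_error(pred, gt):
    """Euclidean distance between box centers; boxes are (x, y, w, h) arrays."""
    pc = pred[:, :2] + pred[:, 2:] / 2
    gc = gt[:, :2] + gt[:, 2:] / 2
    return np.sqrt(((pc - gc) ** 2).sum(axis=1))

def iou(pred, gt):
    """Per-frame intersection-over-union; boxes are (x, y, w, h) arrays."""
    x1 = np.maximum(pred[:, 0], gt[:, 0])
    y1 = np.maximum(pred[:, 1], gt[:, 1])
    x2 = np.minimum(pred[:, 0] + pred[:, 2], gt[:, 0] + gt[:, 2])
    y2 = np.minimum(pred[:, 1] + pred[:, 3], gt[:, 1] + gt[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    union = pred[:, 2] * pred[:, 3] + gt[:, 2] * gt[:, 3] - inter
    return inter / np.maximum(union, 1e-12)

def precision_rate(pred, gt, thresh=20):
    """Fraction of frames whose center error is within `thresh` pixels."""
    return (center_error(pred, gt) <= thresh).mean()

def success_rate(pred, gt):
    """Area under the success curve over overlap thresholds in [0, 1]."""
    thresholds = np.linspace(0, 1, 21)
    ov = iou(pred, gt)
    return np.mean([(ov > t).mean() for t in thresholds])
```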

Comparison with VOT2019-RGBT competitors (EAO, accuracy, robustness).

| Tracker | EAO | Accuracy | Robustness |
| --- | --- | --- | --- |
| JMMAC | 0.4826 | 0.6649 | 0.8211 |
| ADRNet | 0.3959 | 0.6218 | 0.7567 |
| SiamDW-T | 0.3925 | 0.6158 | 0.7839 |
| mfDiMP | 0.3879 | 0.6019 | 0.8036 |
| FSRPN | 0.3553 | 0.6362 | 0.7069 |
| MANet | 0.3463 | 0.5823 | 0.7010 |
| MPAT | 0.3180 | 0.5723 | 0.7242 |
| CISRDCF | 0.2923 | 0.5215 | 0.6904 |
| GESBTT | 0.2896 | 0.6163 | 0.6350 |

## Get Started

### Set up the Anaconda environment

```bash
conda create -n ADRNet python=3.7
conda activate ADRNet
cd $Path_to_ADRNet$
bash install.sh
```
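A quick way to confirm the environment is usable, assuming `install.sh` sets up PyTorch with CUDA support (the exact package versions are whatever the script pins):

```python
# Sanity check after install.sh: confirm PyTorch imports and sees the GPU.
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```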

### Run the demo sequence

```bash
cd $Path_to_ADRNet$
unzip demo.zip
python Run_test.py
```

### Run RGBT234 and GTOT

```bash
cd $Path_to_ADRNet$
python Run_RGBT234.py
python Run_GTOT.py
```
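`Run_RGBT234.py` and `Run_GTOT.py` expect the benchmark data on disk, with the dataset path set inside those scripts. As a hedged illustration only (the root path and the `visible`/`infrared` subfolder names below follow the common RGBT234 layout and are not taken from this repo), a quick check that every sequence folder contains both modalities:

```python
# Hypothetical pre-flight check of a local RGBT234 copy; adjust the root
# and subfolder names to match how your dataset is actually organized.
import os

RGBT234_ROOT = "/path/to/RGBT234"  # assumption: set to your dataset root

for seq in sorted(os.listdir(RGBT234_ROOT)):
    seq_dir = os.path.join(RGBT234_ROOT, seq)
    if not os.path.isdir(seq_dir):
        continue
    for modality in ("visible", "infrared"):
        if not os.path.isdir(os.path.join(seq_dir, modality)):
            print(f"{seq}: missing {modality} frames")
```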

## Training

To train the ADRB, first generate the attribute-specific data for the four challenge attributes (extreme illumination, motion blur, occlusion, and thermal crossover) via

```bash
cd $Path_to_ADRNet/data_generation$
python generate_EI_GTOT.py
python generate_MB_GTOT.py
python generate_OCC_GTOT.py
python generate_TC_GTOT.py
```
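As a rough illustration of what attribute-specific generation involves, here is a generic motion-blur synthesis sketch; it is not the logic of `generate_MB_GTOT.py`, and the kernel size, angle, and file names are illustrative assumptions only.

```python
# Hedged sketch: synthesize a motion-blurred frame by convolving with a
# rotated linear blur kernel (a common way to simulate the MB attribute).
import cv2
import numpy as np

def linear_motion_blur(image, length=15, angle=0.0):
    kernel = np.zeros((length, length), dtype=np.float32)
    kernel[length // 2, :] = 1.0                       # horizontal line kernel
    rot = cv2.getRotationMatrix2D((length / 2, length / 2), angle, 1.0)
    kernel = cv2.warpAffine(kernel, rot, (length, length))
    kernel /= kernel.sum()                             # keep brightness unchanged
    return cv2.filter2D(image, -1, kernel)

frame = cv2.imread("frame.png")                        # hypothetical input frame
blurred = linear_motion_blur(frame, length=21, angle=30.0)
cv2.imwrite("frame_mb.png", blurred)
```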

Then, generate the pkl files via

```bash
cd $Path_to_ADRNet/modules$
python prepro_data_GTOT.py
python prepro_data_RGBT234.py
```
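The `prepro_data_*` scripts dump the training sequences into pickle files; the exact layout and output path are defined in those scripts. A hedged way to inspect what was written (the file path below is an assumption):

```python
# Load a generated pkl file and peek at its structure; the actual keys are
# whatever prepro_data_GTOT.py / prepro_data_RGBT234.py choose to write.
import pickle

with open("GTOT.pkl", "rb") as f:  # hypothetical output path
    data = pickle.load(f)

print(type(data))
if isinstance(data, dict):
    for seq_name, seq_info in list(data.items())[:3]:
        print(seq_name, type(seq_info))
```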

Finally, after setting your data path, train the model:

```bash
cd $Path_to_ADRNet$
python train_ADRNet.py
```

## Model zoo

The pretrained model is available on Google Drive and Baidu Disk (code: 56cu). After downloading, put it in $Path_to_ADRNet/models/$.
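After placing the checkpoint in `models/`, a quick load test can confirm the download is intact; the file name `ADRNet.pth` below is an assumption, so use whatever name the downloaded checkpoint actually has.

```python
# Hedged check that the downloaded checkpoint deserializes correctly.
import torch

state = torch.load("models/ADRNet.pth", map_location="cpu")  # assumed file name
print(type(state))
if isinstance(state, dict):
    print("top-level keys:", list(state.keys())[:10])
```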

## Citation

If you find our work useful, please cite:

```bibtex
@article{Zhang_IJCV21_ADRNet,
  author  = {Pengyu Zhang and Dong Wang and Huchuan Lu and Xiaoyun Yang},
  title   = {Learning Adaptive Attribute-Driven Representation for Real-Time RGB-T Tracking},
  journal = {International Journal of Computer Vision},
  volume  = {129},
  pages   = {2714--2729},
  year    = {2021}
}
```

If you have any questions, feel free to contact me.