Single-Stage Rotation-Decoupled Detector for Oriented Object

This is the repository of the paper Single-Stage Rotation-Decoupled Detector for Oriented Object. [Paper] [PDF]

Update: Updated the code for training on the DOTA, HRSC2016 and UCAS-AOD datasets. Uploaded the weights trained on these datasets.

<img src="demo/graphical-abstract.png" alt="Graphical Abstract" style="zoom: 50%;" />

Introduction

We improve the anchor-based oriented object detection method by decoupling the matching of oriented bounding boxes to oriented anchors into the matching of horizontal bounding boxes to horizontal anchors.
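
A minimal NumPy sketch of this idea is given below. It is illustrative only and not the repository's actual label-assignment code; the box convention (cx, cy, w, h, angle in radians) and the positive-IoU threshold are assumptions. The point is that an oriented ground-truth box participates in matching only through its axis-aligned bounding rectangle, so plain horizontal anchors suffice.

```python
# Illustrative sketch of rotation-decoupled matching: an oriented ground-truth box
# (cx, cy, w, h, angle in radians) is matched against horizontal anchors
# (x1, y1, x2, y2) through the IoU of its axis-aligned bounding rectangle.
# The conventions and threshold here are assumptions, not the repo's actual code.
import numpy as np

def obb_to_hbb(obb):
    """Axis-aligned bounding rectangle (x1, y1, x2, y2) of an oriented box."""
    cx, cy, w, h, a = obb
    corners = np.array([[-w / 2, -h / 2], [w / 2, -h / 2],
                        [w / 2,  h / 2], [-w / 2,  h / 2]])
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    pts = corners @ rot.T + np.array([cx, cy])
    return np.array([pts[:, 0].min(), pts[:, 1].min(),
                     pts[:, 0].max(), pts[:, 1].max()])

def hbb_iou(box, anchors):
    """IoU between one horizontal box and an (N, 4) array of horizontal anchors."""
    x1 = np.maximum(box[0], anchors[:, 0])
    y1 = np.maximum(box[1], anchors[:, 1])
    x2 = np.minimum(box[2], anchors[:, 2])
    y2 = np.minimum(box[3], anchors[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_box = (box[2] - box[0]) * (box[3] - box[1])
    area_anchors = (anchors[:, 2] - anchors[:, 0]) * (anchors[:, 3] - anchors[:, 1])
    return inter / (area_box + area_anchors - inter)

def match_positive_anchors(obb, anchors, pos_iou=0.5):
    """Indices of horizontal anchors assigned to one oriented ground-truth box."""
    return np.where(hbb_iou(obb_to_hbb(obb), anchors) >= pos_iou)[0]
```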

Performance

DOTA1.0 (Task1)

Reported in our paper:

| backbone | MS | mAP | PL | BD | BR | GTF | SV | LV | SH | TC | BC | ST | SBF | RA | HA | SP | HC |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ResNet101 | × | 75.52 | 89.7 | 84.33 | 46.35 | 68.62 | 73.89 | 73.19 | 86.92 | 90.41 | 86.46 | 84.3 | 64.22 | 64.95 | 73.55 | 72.59 | 73.31 |
| ResNet101 | ✓ | 77.75 | 89.15 | 83.92 | 52.51 | 73.06 | 77.81 | 79 | 87.08 | 90.62 | 86.72 | 87.15 | 63.96 | 70.29 | 76.98 | 75.79 | 72.15 |

Retested with the original weights and the newly released code:

| backbone | MS | mAP | PL | BD | BR | GTF | SV | LV | SH | TC | BC | ST | SBF | RA | HA | SP | HC |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ResNet101 | × | 75.02 | 89.61 | 82.01 | 43.35 | 64.79 | 74.10 | 77.54 | 87.11 | 90.84 | 87.15 | 84.80 | 61.52 | 62.22 | 74.49 | 72.57 | 73.13 |
| ResNet101 | ✓ | 77.87 | 89.21 | 84.80 | 53.40 | 73.17 | 78.11 | 79.44 | 87.28 | 90.78 | 86.46 | 87.43 | 63.46 | 69.91 | 77.52 | 76.00 | 71.06 |

Checkpoint:

HRSC2016

Reported in our paper:

| backbone | AP(12) |
| --- | --- |
| ResNet101 | 94.29 |
| ResNet152 | 94.61 |

Update: added the test results obtained with the VOC 07 11-point metric. Retested with the original weights and the newly released code:

| backbone | AP(12) | AP(07) |
| --- | --- | --- |
| ResNet101 | 94.26 | 88.19 |
| ResNet152 | 94.71 | 89.00 |

AP(07) and AP(12) denote results computed with the VOC 2007 (11-point) and VOC 2012 evaluation metrics, respectively.
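
For clarity, the sketch below shows how the two conventions differ (illustrative only, not the repository's evaluation code): VOC 07 averages interpolated precision over 11 fixed recall points, while VOC 12 integrates the full precision envelope.

```python
# Illustrative sketch of the two AP conventions referenced above (not the repo's
# evaluation code). `recall` and `precision` are cumulative arrays computed over
# detections sorted by descending confidence.
import numpy as np

def voc_ap(recall, precision, use_07_metric=False):
    if use_07_metric:
        # VOC 2007: average the interpolated precision at 11 fixed recall points.
        ap = 0.0
        for t in np.arange(0.0, 1.1, 0.1):
            p = precision[recall >= t].max() if np.any(recall >= t) else 0.0
            ap += p / 11.0
        return ap
    # VOC 2012 style: integrate the area under the precision envelope.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    for i in range(mpre.size - 1, 0, -1):        # make precision monotonically decreasing
        mpre[i - 1] = max(mpre[i - 1], mpre[i])
    idx = np.where(mrec[1:] != mrec[:-1])[0]     # recall change points
    return float(np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1]))
```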

Checkpoint:

UCAS-AOD

Reported in our paper:

| backbone | plane | car | mAP |
| --- | --- | --- | --- |
| ResNet101 | 98.86 | 94.96 | 96.86 |
| ResNet152 | 98.85 | 95.18 | 97.01 |

Retested with the original weights and the newly released code:

| backbone | plane | car | mAP |
| --- | --- | --- | --- |
| ResNet101 | 98.86 | 94.96 | 96.91 |
| ResNet152 | 98.93 | 95.14 | 97.03 |

Checkpoint:

Visualization

Result

Run

Requirements

tqdm
numpy
pillow
cython
beautifulsoup4
opencv-python
pytorch>=1.2
torchvision>=0.4
tensorboard>=2.2
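
Assuming a standard pip environment, the Python dependencies above can be installed in one step (install pytorch and torchvision separately, following the official instructions for your CUDA version):

REPO_ROOT$ pip install tqdm numpy pillow cython beautifulsoup4 opencv-python tensorboard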

Compile

# 'rbbox_batched_nms' is used for post-processing in the inference stage
# use gpu, for Linux only
cd $PATH_ROOT/utils/box/ext/rbbox_overlap_gpu
python setup.py build_ext --inplace

# alternatively, use cpu (works on both Windows and Linux)
cd $PATH_ROOT/utils/box/ext/rbbox_overlap_cpu
python setup.py build_ext --inplace
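
For reference, here is a pure-Python, illustrative sketch of what rotated-box NMS does conceptually at inference time. It is not the compiled rbbox_batched_nms extension, and the box convention used here is an assumption of the sketch; the compiled code is far faster.

```python
# Illustrative only: greedy rotated-box NMS using OpenCV's rotated-rectangle
# intersection. Boxes follow an assumed (cx, cy, w, h, angle in degrees) convention.
import cv2
import numpy as np

def rbox_iou(a, b):
    """IoU of two rotated boxes given as (cx, cy, w, h, angle_deg)."""
    ra = ((a[0], a[1]), (a[2], a[3]), a[4])
    rb = ((b[0], b[1]), (b[2], b[3]), b[4])
    _, region = cv2.rotatedRectangleIntersection(ra, rb)
    if region is None or len(region) == 0:
        return 0.0
    inter = cv2.contourArea(cv2.convexHull(region))
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def rbox_nms(boxes, scores, iou_thr=0.45):
    """Greedy NMS: keep the highest-scoring box, drop overlapping ones, repeat."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        order = np.array([j for j in order[1:]
                          if rbox_iou(boxes[i], boxes[j]) <= iou_thr], dtype=np.int64)
    return keep
```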

Pre-trained Weights

Download the pre-trained weight files.

Modify the DIR_WEIGHT defined in config/__init__.py to be the directory where the weight files are placed.

DIR_WEIGHT = '/.../pre-training-weights'

Train on DOTA

Data Preprocessing

Download the DOTA dataset and arrange the files as follows:

$PATH_ROOT/images
----------/labelTxt-v1.0-obb
$PATH_ROOT/images/train/P0000.png
-----------------/train/...
-----------------/val/...
-----------------/test/...

$PATH_ROOT/labelTxt/train/P0000.txt
-------------------/train/...
-------------------/val/...

Modify dir_dataset and dir_save defined in run/dota/prepare.py, run/dota/train.py, and run/dota/evaluate.py to the local paths.

dir_dataset = '/.../PATH_ROOT'  # The directory where the dataset is located
dir_save = '...'                # Output directory

Then run the provided preprocessing script:

REPO_ROOT$ python run/dota/prepare.py

Start Training

REPO_ROOT$ python run/dota/train.py

Evaluate

REPO_ROOT$ python run/dota/evaluate.py

Train on HRSC2016

The steps are similar to those for the DOTA dataset; the corresponding code is provided in run/hrsc2016.

Train on UCAS-AOD

The steps are similar to those for the DOTA dataset; the corresponding code is provided in run/ucas-aod.

To Do

Update the code used for detection.

Citation

@article{rdd,
    title={Single-Stage Rotation-Decoupled Detector for Oriented Object},
    author={Zhong, Bo and Ao, Kai},
    journal={Remote Sensing},
    year={2020}
}