I can find you! Boundary-guided Separated Attention Network for Camouflaged Object Detection (AAAI-22)

This repo is the official implementation of BSA-Net, which has been published at the 36th AAAI Conference on Artificial Intelligence (AAAI-22).

Authors: Hongwei Zhu, Peng Li, Haoran Xie, Xuefeng Yan, Dong Liang, Dapeng Chen, Mingqiang Wei and Jing Qin

Overview

Intro

The main pipeline of our BSA-Net is shown in the figure below.

BSA-Net simulates how humans detect camouflaged objects. We adopt Res2Net as the backbone encoder. After capturing rich context information with the Residual Multi-scale Feature Extractor (RMFE), we design the Separated Attention (SEA) module to distinguish the subtle differences between foreground and background. The Boundary Guider (BG) module is included in the SEA module to strengthen the model's ability to understand object boundaries. Finally, we employ the Shuffle Attention (SHA) block and a feature fusion module to refine the final COD result.
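For readers who prefer code, here is a minimal PyTorch-style sketch of the pipeline described above. It is an illustration only, not the implementation in this repo: the sub-modules are lightweight conv-block stand-ins for RMFE/SEA/BG/SHA, a torchvision ResNet-50 stands in for the Res2Net backbone, and the single-scale layout and tensor shapes are my assumptions.

```python
# Simplified sketch of the BSA-Net pipeline (NOT the repo's actual code).
# Sub-modules are plain conv blocks standing in for RMFE / SEA / BG / SHA,
# and ResNet-50 stands in for the Res2Net backbone used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


def conv_block(in_ch, out_ch):
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1),
                         nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))


class BSANetSketch(nn.Module):
    def __init__(self, ch=64):
        super().__init__()
        backbone = resnet50(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu,
                                  backbone.maxpool, backbone.layer1, backbone.layer2)
        self.rmfe = conv_block(512, ch)           # stand-in for the multi-scale extractor
        self.boundary_head = nn.Conv2d(ch, 1, 1)  # Boundary Guider: predicts an edge map
        self.fg_branch = conv_block(ch + 1, ch)   # Separated Attention: foreground branch
        self.bg_branch = conv_block(ch + 1, ch)   # Separated Attention: background branch
        self.fuse = conv_block(2 * ch, ch)        # stand-in for SHA + feature fusion
        self.pred_head = nn.Conv2d(ch, 1, 1)

    def forward(self, x):
        feat = self.rmfe(self.stem(x))            # rich context features
        edge = self.boundary_head(feat)           # boundary prior
        fg = self.fg_branch(torch.cat([feat, torch.sigmoid(edge)], dim=1))
        bg = self.bg_branch(torch.cat([feat, 1 - torch.sigmoid(edge)], dim=1))
        out = self.pred_head(self.fuse(torch.cat([fg, bg], dim=1)))
        size = x.shape[2:]                        # upsample back to input resolution
        return (F.interpolate(out, size=size, mode="bilinear", align_corners=False),
                F.interpolate(edge, size=size, mode="bilinear", align_corners=False))


if __name__ == "__main__":
    mask, edge = BSANetSketch()(torch.randn(1, 3, 352, 352))
    print(mask.shape, edge.shape)  # both torch.Size([1, 1, 352, 352])
```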

Result

The experimental results are shown below.

Usage

Dependencies

Please refer to requirements.txt.

Install the necessary packages: pip install -r requirements.txt.

Datasets

After downloading, please move the training/testing datasets into ./Dataset/. A small sanity-check script is sketched below.
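The script below simply checks that the expected folders exist under ./Dataset/. Only ./Dataset/TestDataset/*/GT is referenced later in this README, so the TrainDataset folder name is an assumption; adapt the paths to your actual layout.

```python
# Quick sanity check for the dataset layout under ./Dataset/.
# NOTE: "TrainDataset" is an assumed folder name; only TestDataset/*/GT is
# referenced later in this README.
import glob
import os

root = "./Dataset"
for split in ("TrainDataset", "TestDataset"):
    path = os.path.join(root, split)
    print(f"{path}: {'found' if os.path.isdir(path) else 'MISSING'}")

# List the ground-truth folders the evaluation step expects.
for gt_dir in sorted(glob.glob(os.path.join(root, "TestDataset", "*", "GT"))):
    n_maps = len(glob.glob(os.path.join(gt_dir, "*")))
    print(f"{gt_dir}: {n_maps} GT maps")
```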

Train

After downloading the training dataset, just run MyTrain.py. You can change the arguments to customize your training settings. The trained model will be saved in ./Snapshot.
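Once training has produced a checkpoint under ./Snapshot, a snippet like the one below can be used to inspect it. The checkpoint filename is hypothetical, since this README does not specify how MyTrain.py names its saved files.

```python
# Inspect a trained checkpoint saved by MyTrain.py under ./Snapshot.
# NOTE: the filename below is hypothetical; check ./Snapshot for the actual
# name produced by your training run.
import torch

ckpt_path = "./Snapshot/BSANet_final.pth"  # hypothetical filename
state = torch.load(ckpt_path, map_location="cpu")

# A checkpoint may be a raw state_dict or a dict wrapping one.
state_dict = state.get("state_dict", state) if isinstance(state, dict) else state
for name, tensor in list(state_dict.items())[:10]:
    print(name, tuple(tensor.shape))
```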

Test

Evaluation

We provide a complete and fair one-key evaluation toolbox for benchmarking under a uniform standard. Please refer to this link for more information:

Copy the testing GT maps (./Dataset/TestDataset/*/GT) into ./evaluation/GT and run ./evaluation/evaluation.py. When the evaluation finishes, the metric results will be saved to ./result.txt.
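The helper below performs the copy step described above and then invokes the evaluation script. Placing each test dataset's GT folder in its own sub-directory under ./evaluation/GT is my assumption; adjust the destination layout if ./evaluation/evaluation.py expects something different.

```python
# Copy the testing GT maps into ./evaluation/GT and run the evaluation script.
# NOTE: one sub-directory per test dataset under ./evaluation/GT is an
# assumption; adjust if evaluation.py expects a different layout.
import glob
import os
import shutil
import subprocess

for gt_dir in glob.glob("./Dataset/TestDataset/*/GT"):
    dataset = os.path.basename(os.path.dirname(gt_dir))   # test dataset name
    dst = os.path.join("./evaluation/GT", dataset)
    shutil.copytree(gt_dir, dst, dirs_exist_ok=True)

# Run the evaluation; metric results are written to ./result.txt.
subprocess.run(["python", "./evaluation/evaluation.py"], check=True)
```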

Contact

If you have any questions, feel free to e-mail me at zhuhongwei1999@gmail.com.