WS-SAM
Weakly-Supervised Concealed Object Segmentation with SAM-based Pseudo Labeling and Multi-scale Feature Grouping, NeurIPS 2023
[Paper] [Supplementary material] [Results] [Pretrained models]
Authors
Chunming He^, Kai Li^, Yachao Zhang, Guoxia Xu, Longxiang Tang, Yulun Zhang, Zhenhua Guo, Xiu Li*
Abstract: Weakly-Supervised Concealed Object Segmentation (WSCOS) aims to segment objects well blended with surrounding environments using sparsely-annotated data for model training. It remains a challenging task since (1) it is hard to distinguish concealed objects from the background due to the intrinsic similarity and (2) the sparsely-annotated training data only provide weak supervision for model learning. In this paper, we propose a new WSCOS method to address these two challenges. To tackle the intrinsic similarity challenge, we design a multi-scale feature grouping module that first groups features at different granularities and then aggregates these grouping results. By grouping similar features together, it encourages segmentation coherence, helping obtain complete segmentation results for both single and multiple-object images. For the weak supervision challenge, we utilize the recently-proposed vision foundation model, "Segment Anything Model (SAM)", and use the provided sparse annotations as prompts to generate segmentation masks, which are used to train the model. To alleviate the impact of low-quality segmentation masks, we further propose a series of strategies, including multi-augmentation result ensemble, entropy-based pixel-level weighting, and entropy-based image-level selection. These strategies help provide more reliable supervision to train the segmentation model. We verify the effectiveness of our method on various WSCOS tasks, and experiments demonstrate that our method achieves state-of-the-art performance on these tasks.
<p align="center"> <img width="1000" src="Framework.jpg"> </p>
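To make these supervision strategies concrete, below is a minimal PyTorch sketch of the multi-augmentation result ensemble and the entropy-based pixel-level weighting described in the abstract. The function names, tensor shapes, and the exact weighting scheme are illustrative assumptions, not the repository's actual implementation.

```python
import math

import torch
import torch.nn.functional as F

def pixel_entropy(prob, eps=1e-8):
    # Binary (Bernoulli) entropy of per-pixel foreground probabilities in [0, 1].
    return -(prob * (prob + eps).log() + (1.0 - prob) * (1.0 - prob + eps).log())

def ensemble_pseudo_mask(sam_probs):
    # sam_probs: (K, H, W) foreground probabilities from K augmented views of the
    # same image, assumed to be already mapped back to the original geometry.
    return sam_probs.mean(dim=0)  # (H, W) ensembled pseudo mask

def weighted_bce_loss(pred_logits, pseudo_prob):
    # Entropy-based pixel-level weighting: down-weight unreliable (high-entropy)
    # pixels of the pseudo mask when supervising the segmenter.
    entropy = pixel_entropy(pseudo_prob)
    weight = (1.0 - entropy / math.log(2.0)).clamp(min=0.0)  # ln 2 = max binary entropy
    loss = F.binary_cross_entropy_with_logits(
        pred_logits, (pseudo_prob > 0.5).float(), reduction="none")
    return (weight * loss).mean()
```

Averaging the same per-pixel entropy over the whole image gives a per-image reliability score, which is the idea behind the entropy-based image-level selection (a filtering sketch appears in Section 2).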
1. Prerequisites
Note that WS-SAM has only been tested on Ubuntu with the following environment.
- Create a virtual environment in the terminal:
conda create -n FEDER python=3.8
- Install the necessary packages:
pip install -r requirements.txt
2. Downloading Training and Testing Datasets
- Download the training pseudo masks generated by our WS-SAM, where the extraction code is 6666. We provide all pseudo masks to ensure completeness; feel free to filter out low-quality masks in the training phase (a minimal filtering sketch follows this list).
- You can find useful training and testing datasets in this repository.
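As one example of entropy-based image-level selection, the sketch below keeps only pseudo masks whose mean pixel entropy is low. The directory layout, grayscale PNG format, and the 0.3 threshold are illustrative assumptions, not values from the paper.

```python
from pathlib import Path

import numpy as np
from PIL import Image

def mean_mask_entropy(mask_path, eps=1e-8):
    # Mean binary entropy of a grayscale pseudo mask, read as probabilities.
    p = np.asarray(Image.open(mask_path).convert("L"), dtype=np.float32) / 255.0
    entropy = -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
    return float(entropy.mean())

def select_reliable_masks(mask_dir, threshold=0.3):
    # Keep masks whose average entropy is below the threshold (more confident).
    return [f for f in sorted(Path(mask_dir).glob("*.png"))
            if mean_mask_entropy(f) < threshold]
```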
3. Training Configuration
- The pretrained model is stored in Google Drive. After downloading it, please update the file path in the corresponding code.
python Train.py --epoch 160 --lr 1e-4 --batchsize 36 --trainsize 36 --train_root YOUR_TRAININGSETPATH --val_root YOUR_VALIDATIONSETPATH --save_path YOUR_CHECKPOINTPATH
4. Testing Configuration
Our trained model is stored on Baidu Yun. After downloading it, please update the file path in the corresponding code.
python Test.py --testsize YOUR_IMAGESIZE --pth_path YOUR_CHECKPOINTPATH --test_dataset_path YOUR_TESTINGSETPATH
5. Evaluation
- MATLAB code: one-key evaluation is implemented in MATLAB. Please follow the instructions in main.m and run it to generate the evaluation results.
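If MATLAB is unavailable, the simplest of the standard COD metrics, MAE, is easy to reproduce in Python. A minimal sketch, assuming predictions and ground truths are grayscale images of the same resolution:

```python
import numpy as np
from PIL import Image

def mae(pred_path, gt_path):
    # Mean absolute error between a predicted map and its ground-truth mask.
    pred = np.asarray(Image.open(pred_path).convert("L"), dtype=np.float32) / 255.0
    gt = np.asarray(Image.open(gt_path).convert("L"), dtype=np.float32) / 255.0
    return float(np.abs(pred - gt).mean())
```

The MATLAB toolkit additionally reports structure-based and other metrics, so this sketch is only a quick sanity check rather than a replacement for main.m.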
6. Results Download
The prediction results of our WS-SAM are stored on Baidu Yun; please check them there.
Related Works
- Camouflaged Object Detection with Feature Decomposition and Edge Reconstruction, CVPR 2023.
- Segment Anything, ICCV 2023.
- Concealed Object Detection, TPAMI 2022.
You can see more related papers in Awesome-COS.
📎 Citation
If you find the code helpful in your research or work, please cite the following paper(s).
@article{he2023weaklysupervised,
title={Weakly-Supervised Concealed Object Segmentation with SAM-based Pseudo Labeling and Multi-scale Feature Grouping},
author={He, Chunming and Li, Kai and Zhang, Yachao and Xu, Guoxia and Tang, Longxiang and Zhang, Yulun and Guo, Zhenhua and Li, Xiu},
journal={NeurIPS},
year={2023}
}
Contact
If you have any questions, please feel free to contact me via email at chunminghe19990224@gmail.com or hcm21@mails.tsinghua.edu.cn.
💡 Acknowledgements
The code is based on FEDER and SINet V2; please also follow their licenses. Thanks for their awesome work.