
K-Net: Towards Unified Image Segmentation

Introduction

This is an official release of the paper K-Net: Towards Unified Image Segmentation. K-Net will also be integrated into future releases of MMDetection and MMSegmentation.

K-Net: Towards Unified Image Segmentation,
Wenwei Zhang, Jiangmiao Pang, Kai Chen, Chen Change Loy
In: Proc. Advances in Neural Information Processing Systems (NeurIPS), 2021
[arXiv] [project page] [BibTeX]

Results

The results of K-Net and the corresponding configs for each segmentation task are shown below. The full model zoo for panoptic segmentation has been released; the complete model checkpoints and logs for instance and semantic segmentation will be released soon.

Semantic Segmentation on ADE20K

| Backbone | Method | Crop Size | Lr Schd | mIoU | Config | Download |
| :------: | :----: | :-------: | :-----: | :--: | :----: | :------: |
| R-50 | K-Net + FCN | 512x512 | 80K | 43.3 | config | model \| log |
| R-50 | K-Net + PSPNet | 512x512 | 80K | 43.9 | config | model \| log |
| R-50 | K-Net + DeepLabv3 | 512x512 | 80K | 44.6 | config | model \| log |
| R-50 | K-Net + UPerNet | 512x512 | 80K | 43.6 | config | model \| log |
| Swin-T | K-Net + UPerNet | 512x512 | 80K | 45.4 | config | model \| log |
| Swin-L | K-Net + UPerNet | 512x512 | 80K | 52.0 | config | model \| log |
| Swin-L | K-Net + UPerNet | 640x640 | 80K | 52.7 | config | model \| log |

Instance Segmentation on COCO

| Backbone | Method | Lr Schd | Mask mAP | Config | Download |
| :------: | :----: | :-----: | :------: | :----: | :------: |
| R-50 | K-Net | 1x | 34.0 | config | model \| log |
| R-50 | K-Net | ms-3x | 37.8 | config | model \| log |
| R-101 | K-Net | ms-3x | 39.2 | config | model \| log |
| R-101-DCN | K-Net | ms-3x | 40.5 | config | model \| log |

Panoptic Segmentation on COCO

| Backbone | Method | Lr Schd | PQ | Config | Download |
| :------: | :----: | :-----: | :-: | :----: | :------: |
| R-50 | K-Net | 1x | 44.3 | config | model \| log |
| R-50 | K-Net | ms-3x | 47.1 | config | model \| log |
| R-101 | K-Net | ms-3x | 48.4 | config | model \| log |
| R-101-DCN | K-Net | ms-3x | 49.6 | config | model \| log |
| Swin-L (window size 7) | K-Net | ms-3x | 54.6 | config | model \| log |
| Above on test-dev | K-Net | ms-3x | 55.2 | - | - |

Installation

It requires the following OpenMMLab packages:

```shell
pip install openmim scipy mmdet mmsegmentation
pip install git+https://github.com/cocodataset/panopticapi.git
mim install mmcv-full
```
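
As an optional sanity check, you can confirm the OpenMMLab packages are importable and see which versions were installed:

```shell
# Optional: verify the packages import correctly and print their versions.
python -c "import mmcv, mmdet, mmseg; print(mmcv.__version__, mmdet.__version__, mmseg.__version__)"
```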

License

This project is released under the Apache 2.0 license.

Usage

Data preparation

Prepare the data following MMDetection and MMSegmentation. The expected data structure is shown below:

```text
data/
├── ade
│   ├── ADEChallengeData2016
│   │   ├── annotations
│   │   ├── images
├── coco
│   ├── annotations
│   │   ├── panoptic_{train,val}2017.json
│   │   ├── instances_{train,val}2017.json
│   │   ├── panoptic_{train,val}2017/  # panoptic png annotations
│   │   ├── image_info_test-dev2017.json  # for test-dev submissions
│   ├── train2017
│   ├── val2017
│   ├── test2017
```
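
If you still need to download COCO, the standard archives from cocodataset.org match this layout. The snippet below is a minimal sketch; note that the panoptic annotation zip contains nested `panoptic_{train,val}2017.zip` archives that also need to be extracted:

```shell
# Minimal sketch: fetch the official COCO 2017 archives into data/coco.
mkdir -p data/coco && cd data/coco
wget http://images.cocodataset.org/zips/train2017.zip
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/zips/test2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
wget http://images.cocodataset.org/annotations/panoptic_annotations_trainval2017.zip
wget http://images.cocodataset.org/annotations/image_info_test2017.zip
unzip '*.zip'
```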

Training and testing

You can train and test the models directly with mim:

```shell
# train instance/panoptic segmentation models
sh ./tools/mim_slurm_train.sh $PARTITION mmdet $CONFIG $WORK_DIR

# test instance segmentation models
sh ./tools/mim_slurm_test.sh $PARTITION mmdet $CONFIG $CHECKPOINT --eval segm

# test panoptic segmentation models
sh ./tools/mim_slurm_test.sh $PARTITION mmdet $CONFIG $CHECKPOINT --eval pq

# train semantic segmentation models
sh ./tools/mim_slurm_train.sh $PARTITION mmseg $CONFIG $WORK_DIR

# test semantic segmentation models
sh ./tools/mim_slurm_test.sh $PARTITION mmseg $CONFIG $CHECKPOINT --eval mIoU
```
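
For example, with placeholder values filled in (the partition name and config path below are hypothetical; use an actual config linked in the Results tables above):

```shell
# Hypothetical example: partition name and config path are placeholders.
PARTITION=gpu
CONFIG=path/to/knet_instance_config.py  # pick a config from the tables above
WORK_DIR=work_dirs/knet_r50_1x
sh ./tools/mim_slurm_train.sh $PARTITION mmdet $CONFIG $WORK_DIR
```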

For a panoptic segmentation test-dev submission, use the commands below:

```shell
# Update the category information in the original test-dev image info file
# to generate the annotation file for panoptic segmentation.
python -u tools/gen_panoptic_test_info.py

# Run the test-dev submission.
sh ./tools/mim_slurm_test.sh $PARTITION mmdet $CONFIG $CHECKPOINT \
    --format-only \
    --cfg-options data.test.ann_file=data/coco/annotations/panoptic_image_info_test-dev2017.json data.test.img_prefix=data/coco/test2017 \
    --eval-options jsonfile_prefix=$WORK_DIR
```

You can also run training and testing without slurm by using mim directly for instance/semantic/panoptic segmentation, as below:

```shell
PYTHONPATH='.':$PYTHONPATH mim train mmdet $CONFIG --work-dir $WORK_DIR
PYTHONPATH='.':$PYTHONPATH mim train mmseg $CONFIG --work-dir $WORK_DIR
```
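
Testing without slurm should similarly go through `mim test`; the flags below follow mim's general CLI and may vary across mim versions, so treat this as a sketch:

```shell
# Sketch, assuming mim's standard test interface (see `mim test -h`).
PYTHONPATH='.':$PYTHONPATH mim test mmdet $CONFIG --checkpoint $CHECKPOINT --gpus 1 --eval segm
PYTHONPATH='.':$PYTHONPATH mim test mmseg $CONFIG --checkpoint $CHECKPOINT --gpus 1 --eval mIoU
```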

Citation

```bibtex
@inproceedings{zhang2021knet,
    title={{K-Net: Towards} Unified Image Segmentation},
    author={Wenwei Zhang and Jiangmiao Pang and Kai Chen and Chen Change Loy},
    year={2021},
    booktitle={NeurIPS},
}
```