Q-DETR: An Efficient Low-Bit Quantized Detection Transformer

Usage

First, clone the repository locally:

git clone https://github.com/facebookresearch/detr.git

Then, install PyTorch 1.5+ and torchvision 0.6+:

conda install -c pytorch pytorch torchvision

Install pycocotools (for evaluation on COCO) and scipy (for training):

conda install cython scipy
pip install -U 'git+https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI'
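To confirm the dependencies were installed correctly, a quick sanity check like the following can be run (this is just a suggested check, not part of the official setup):

python -c "import torch, torchvision, scipy, pycocotools; print(torch.__version__, torchvision.__version__)"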

That's it; you should now be able to train and evaluate detection models.

(Optional) To work with panoptic segmentation, install panopticapi:

pip install git+https://github.com/cocodataset/panopticapi.git

Data preparation

Download and extract COCO 2017 train and val images with annotations from http://cocodataset.org. We expect the directory structure to be the following:

path/to/coco/
  annotations/  # annotation json files
  train2017/    # train images
  val2017/      # val images
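One way to fetch and unpack the 2017 images and annotations is shown below; the URLs are the standard COCO download links, and the target directory should be adjusted to match your path/to/coco:

cd path/to/coco
wget http://images.cocodataset.org/zips/train2017.zip
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
unzip train2017.zip && unzip val2017.zip && unzip annotations_trainval2017.zip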

Evaluation

Adjust the paths in the script evaluate_coco.sh to point to your data and checkpoint, then run:

bash evaluate_coco.sh
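The exact contents of evaluate_coco.sh are specific to this repository, but since the codebase builds on DETR/SMCA-DETR, the script typically wraps a call along the following lines. The flags and checkpoint path below are illustrative assumptions, not a verbatim copy of the script:

python main.py --eval --batch_size 2 --no_aux_loss \
    --resume path/to/q_detr_checkpoint.pth \
    --coco_path path/to/coco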

Model Zoo

| Methods     | Bit-width | Epoch | box AP | Model Link |
|-------------|-----------|-------|--------|------------|
| Real-valued | 32-bit    | 50    | 41.0   | -          |
| Q-DETR      | 4-bit     | 50    | 38.5   | Model      |
| Q-DETR      | 2-bit     | 50    | 32.4   | Model      |

Citation

If you find this repository useful, please consider citing our work:

@inproceedings{xu2023q,
  title={Q-DETR: An Efficient Low-Bit Quantized Detection Transformer},
  author={Xu, Sheng and Li, Yanjing and Lin, Mingbao and Gao, Peng and Guo, Guodong and L{\"u}, Jinhu and Zhang, Baochang},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={3842--3851},
  year={2023}
}

Acknowledgement

This project borrows heavily from SMCA-DETR.