<div align=center><img src=".github/mdistiller.png" width="40%" ><div align=left>

This repo is:

(1) a PyTorch library that provides classical knowledge distillation algorithms on mainstream CV benchmarks,

(2) the official implementation of the CVPR-2022 paper: Decoupled Knowledge Distillation,

(3) the official implementation of the ICCV-2023 paper: DOT: A Distillation-Oriented Trainer.

DOT: A Distillation-Oriented Trainer

Framework

<div style="text-align:center"><img src=".github/dot.png" width="80%" ></div>

Main Benchmark Results

On CIFAR-100:

| Teacher <br> Student | ResNet32x4 <br> ResNet8x4 | VGG13 <br> VGG8 | ResNet32x4 <br> ShuffleNet-V2 |
|---|---|---|---|
| KD | 73.33 | 72.98 | 74.45 |
| KD+DOT | 75.12 | 73.77 | 75.55 |

On Tiny-ImageNet:

| Teacher <br> Student | ResNet18 <br> MobileNet-V2 | ResNet18 <br> ShuffleNet-V2 |
|---|---|---|
| KD | 58.35 | 62.26 |
| KD+DOT | 64.01 | 65.75 |

On ImageNet:

| Teacher <br> Student | ResNet34 <br> ResNet18 | ResNet50 <br> MobileNet-V1 |
|---|---|---|
| KD | 71.03 | 70.50 |
| KD+DOT | 71.72 | 73.09 |

Decoupled Knowledge Distillation

Framework & Performance

<div style="text-align:center"><img src=".github/dkd.png" width="80%" ></div>

Main Benchmark Results

On CIFAR-100:

| Teacher <br> Student | ResNet56 <br> ResNet20 | ResNet110 <br> ResNet32 | ResNet32x4 <br> ResNet8x4 | WRN-40-2 <br> WRN-16-2 | WRN-40-2 <br> WRN-40-1 | VGG13 <br> VGG8 |
|---|---|---|---|---|---|---|
| KD | 70.66 | 73.08 | 73.33 | 74.92 | 73.54 | 72.98 |
| DKD | 71.97 | 74.11 | 76.32 | 76.23 | 74.81 | 74.68 |

| Teacher <br> Student | ResNet32x4 <br> ShuffleNet-V1 | WRN-40-2 <br> ShuffleNet-V1 | VGG13 <br> MobileNet-V2 | ResNet50 <br> MobileNet-V2 | ResNet32x4 <br> MobileNet-V2 |
|---|---|---|---|---|---|
| KD | 74.07 | 74.83 | 67.37 | 67.35 | 74.45 |
| DKD | 76.45 | 76.70 | 69.71 | 70.35 | 77.07 |

On ImageNet:

| Teacher <br> Student | ResNet34 <br> ResNet18 | ResNet50 <br> MobileNet-V1 |
|---|---|---|
| KD | 71.03 | 70.50 |
| DKD | 71.70 | 72.05 |

MDistiller

Introduction

MDistiller supports the following distillation methods on CIFAR-100, ImageNet and MS-COCO:

| Method | Paper Link |
|---|---|
| KD | https://arxiv.org/abs/1503.02531 |
| FitNet | https://arxiv.org/abs/1412.6550 |
| AT | https://arxiv.org/abs/1612.03928 |
| NST | https://arxiv.org/abs/1707.01219 |
| PKT | https://arxiv.org/abs/1803.10837 |
| KDSVD | https://arxiv.org/abs/1807.06819 |
| OFD | https://arxiv.org/abs/1904.01866 |
| RKD | https://arxiv.org/abs/1904.05068 |
| VID | https://arxiv.org/abs/1904.05835 |
| SP | https://arxiv.org/abs/1907.09682 |
| CRD | https://arxiv.org/abs/1910.10699 |
| ReviewKD | https://arxiv.org/abs/2104.09044 |
| DKD | https://arxiv.org/abs/2203.08679 |

Installation

Environments:

Install the package:

```bash
sudo pip3 install -r requirements.txt
sudo python3 setup.py develop
```

Getting started

  1. Wandb as the logger
  2. Evaluation
  3. Training on CIFAR-100 (an illustrative command follows this list)
  4. Training on ImageNet
  5. Training on MS-COCO
  6. Extension: Visualizations
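For orientation, a CIFAR-100 training run typically amounts to pointing a training script at a YAML config. The command below is only an illustration: the entry-point path `tools/train.py`, the `--cfg` flag, and the config filename are assumptions that may differ from the actual repository layout.

```bash
# Illustrative only; verify the script path and config name in your checkout.
python3 tools/train.py --cfg configs/cifar100/dkd/res32x4_res8x4.yaml
```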

Custom Distillation Method

  1. Create a Python file at mdistiller/distillers/ and define the distiller (a filled-in sketch follows these steps):

```python
from ._base import Distiller

class MyDistiller(Distiller):
    def __init__(self, student, teacher, cfg):
        super(MyDistiller, self).__init__(student, teacher)
        self.hyper1 = cfg.MyDistiller.hyper1
        ...

    def forward_train(self, image, target, **kwargs):
        # Return the output logits and a dict of losses.
        ...

    # Override get_learnable_parameters if the distiller introduces extra nn modules.
    # Override get_extra_parameters if you want to report the extra parameter cost.
```

  2. Register the distiller in distiller_dict at mdistiller/distillers/__init__.py.

  3. Register the corresponding hyper-parameters at mdistiller/engines/cfg.py.

  4. Create a new config file and test it.
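To make the skeleton above concrete, here is a hypothetical distiller that adds a plain temperature-scaled KD loss on top of cross-entropy, together with a registration sketch. The class name, config keys, loss-dict keys, and the assumption that the wrapped models return (logits, features) are all illustrative; check the bundled distillers in mdistiller/distillers/ for the actual conventions.

```python
import torch
import torch.nn.functional as F

from ._base import Distiller


class MyKD(Distiller):
    """Hypothetical distiller: cross-entropy plus a temperature-scaled KL term."""

    def __init__(self, student, teacher, cfg):
        super(MyKD, self).__init__(student, teacher)
        # Hypothetical config keys; register them in the global cfg first (step 3).
        self.temperature = cfg.MYKD.TEMPERATURE
        self.kd_weight = cfg.MYKD.LOSS_WEIGHT

    def forward_train(self, image, target, **kwargs):
        # Assumption: the wrapped models return (logits, features); adapt if yours differ.
        logits_student, _ = self.student(image)
        with torch.no_grad():
            logits_teacher, _ = self.teacher(image)

        loss_ce = F.cross_entropy(logits_student, target)
        T = self.temperature
        loss_kd = self.kd_weight * (T ** 2) * F.kl_div(
            F.log_softmax(logits_student / T, dim=1),
            F.softmax(logits_teacher / T, dim=1),
            reduction="batchmean",
        )
        # Contract from the skeleton above: return logits and a dict of named losses.
        return logits_student, {"loss_ce": loss_ce, "loss_kd": loss_kd}


# Registration sketch for mdistiller/distillers/__init__.py (step 2):
#   from .MyKD import MyKD
#   distiller_dict["MYKD"] = MyKD
```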

Citation

If this repo is helpful for your research, please consider citing the papers:

```BibTeX
@article{zhao2022dkd,
  title={Decoupled Knowledge Distillation},
  author={Zhao, Borui and Cui, Quan and Song, Renjie and Qiu, Yiyu and Liang, Jiajun},
  journal={arXiv preprint arXiv:2203.08679},
  year={2022}
}

@article{zhao2023dot,
  title={DOT: A Distillation-Oriented Trainer},
  author={Zhao, Borui and Cui, Quan and Song, Renjie and Liang, Jiajun},
  journal={arXiv preprint arXiv:2307.08436},
  year={2023}
}
```

License

MDistiller is released under the MIT license. See LICENSE for details.

Acknowledgement