Knowledge Distillation with Adversarial Samples Supporting Decision Boundary
Official PyTorch implementation of the paper:
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019).
Spotlight and poster are available on the homepage.
Environment
Python 3.6, PyTorch 0.4.1, torchvision
Knowledge distillation (CIFAR-10)
python train_BSS_distillation.py
Distillation from a ResNet 26 (teacher) to a ResNet 10 (student) on the CIFAR-10 dataset.
A pre-trained teacher network (ResNet 26) is included.
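For reference, the sketch below shows the standard soft-label distillation loss (Hinton et al., 2015) that BSS distillation builds on. It is not the adversarial boundary-supporting-sample loss implemented in train_BSS_distillation.py, and the names kd_loss, temperature, and alpha are illustrative assumptions; the reduction="batchmean" argument also assumes a newer PyTorch than the 0.4.1 listed above.

```python
# Minimal sketch of standard soft-label knowledge distillation
# (Hinton et al., 2015). This is NOT the BSS adversarial-sample loss
# from this repository; names and defaults here are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Weighted sum of a softened KL term and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale gradients of the softened term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```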
Citation
@inproceedings{BSSdistill,
title = {Knowledge Distillation with Adversarial Samples Supporting Decision Boundary},
author = {Byeongho Heo and Minsik Lee and Sangdoo Yun and Jin Young Choi},
booktitle = {AAAI Conference on Artificial Intelligence (AAAI)},
year = {2019}
}