Zero-Shot Knowledge Distillation in Deep Networks (PyTorch)

:star: Star us on GitHub — it helps!!

A PyTorch implementation of the paper Zero-Shot Knowledge Distillation in Deep Networks (Nayak et al., ICML 2019).

Install

You will need a machine with a GPU and CUDA installed.
Then install the runtime dependencies:

pip install -r requirements.txt
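
Before launching an experiment, you can confirm that PyTorch sees the GPU. The snippet below is a minimal sanity check, assuming PyTorch was installed via requirements.txt:

```python
import torch

# Sanity check: the experiments in this repo expect a CUDA-capable GPU.
print("CUDA available:", torch.cuda.is_available())   # should print True
print("CUDA version  :", torch.version.cuda)          # CUDA toolkit PyTorch was built against
if torch.cuda.is_available():
    print("GPU device    :", torch.cuda.get_device_name(0))
```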

Usage

For the MNIST dataset:

python main.py --dataset=mnist --t_train=False --num_sample=12000 --batch_size=200 

For the CIFAR-10 dataset:

python main.py --dataset=cifar10 --t_train=False --num_sample=24000 --batch_size=100

Arguments (as used in the example commands above):

- --dataset: dataset to use (mnist or cifar10)
- --t_train: whether to train the teacher network first (the examples pass False, which presumably reuses a pretrained teacher)
- --num_sample: number of synthetic samples (Data Impressions) to generate
- --batch_size: batch size used during training
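
Since main.py is not reproduced here, the following is only a hypothetical sketch of how these flags might be declared with argparse; the str2bool helper, the defaults, and the help strings are assumptions, not the repository's actual code:

```python
import argparse

def str2bool(v):
    # argparse's type=bool would treat the string "False" as truthy,
    # so boolean flags like --t_train=False need explicit parsing.
    return str(v).lower() in ("true", "1", "yes")

parser = argparse.ArgumentParser(description="Zero-Shot Knowledge Distillation")
parser.add_argument("--dataset", choices=["mnist", "cifar10"], default="mnist")
parser.add_argument("--t_train", type=str2bool, default=False,
                    help="train the teacher from scratch; False loads a pretrained teacher")
parser.add_argument("--num_sample", type=int, default=12000,
                    help="number of Data Impressions to synthesize")
parser.add_argument("--batch_size", type=int, default=200)
args = parser.parse_args()
```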

Result examples for the MNIST dataset

Understanding the method (algorithm)

:white_check_mark: For a walkthrough of the algorithm, check out my blog post: Here
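
The paper's central idea is to synthesize training data ("Data Impressions") from the teacher alone: the teacher's final-layer weights give a class-similarity matrix, which parameterizes a Dirichlet distribution over soft labels; random inputs are then optimized so the teacher's predictions match the sampled soft labels, and the student is distilled on these synthetic samples. The snippet below is a minimal, illustrative sketch of that synthesis step; the function names, image shape, and hyper-parameters are placeholders and are not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def class_similarity(fc_weight):
    # fc_weight: (num_classes, feat_dim) weight matrix of the teacher's last layer.
    w = F.normalize(fc_weight, dim=1)
    sim = w @ w.t()                                     # cosine similarity between class vectors
    return (sim - sim.min()) / (sim.max() - sim.min())  # rescale to [0, 1]

def synthesize_data_impression(teacher, target_class, sim, beta=1.0,
                               steps=1500, lr=0.01, img_shape=(1, 1, 32, 32)):
    """Optimize one random input so the (frozen) teacher predicts a soft label
    sampled from a Dirichlet built from the target class's similarity row."""
    concentration = (beta * sim[target_class]).clamp(min=1e-3)
    soft_label = torch.distributions.Dirichlet(concentration).sample()  # (num_classes,)

    x = torch.randn(img_shape, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        log_probs = F.log_softmax(teacher(x), dim=1)
        # Match the teacher's output distribution to the sampled soft label.
        loss = F.kl_div(log_probs, soft_label.unsqueeze(0), reduction="batchmean")
        loss.backward()
        optimizer.step()
    return x.detach(), soft_label
```

In the paper, a batch of such Data Impressions is generated per class and the student network is then trained on them with the usual temperature-scaled distillation loss.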