Network Slimming (Pytorch)

This repository contains an official PyTorch implementation of the following paper:
Learning Efficient Convolutional Networks Through Network Slimming (ICCV 2017).
Zhuang Liu, Jianguo Li, Zhiqiang Shen, Gao Huang, Shoumeng Yan, Changshui Zhang.

Original implementation: slimming in Torch.
The code is based on pytorch-slimming. We add support for ResNet and DenseNet.

Citation:

@InProceedings{Liu_2017_ICCV,
    author = {Liu, Zhuang and Li, Jianguo and Shen, Zhiqiang and Huang, Gao and Yan, Shoumeng and Zhang, Changshui},
    title = {Learning Efficient Convolutional Networks Through Network Slimming},
    booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
    month = {Oct},
    year = {2017}
}

Dependencies

torch v0.3.1, torchvision v0.2.0

Channel Selection Layer

We introduce a channel selection layer to help prune ResNet and DenseNet. This layer is easy to implement: it stores a parameter indexes, initialized to an all-1 vector. During pruning, the entries corresponding to pruned channels are set to 0.
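The idea can be sketched as a small PyTorch module (a minimal sketch; the class name and the exact indexing are illustrative, not the repository's exact code):

```python
import torch
import torch.nn as nn

class ChannelSelection(nn.Module):
    """Sketch of a channel selection layer: a 0/1 vector over channels.

    `indexes` starts as all ones; pruning sets the entries of removed
    channels to 0, so the layer passes through only the kept channels.
    """
    def __init__(self, num_channels):
        super().__init__()
        # Non-trainable bookkeeping vector, one entry per channel.
        self.indexes = nn.Parameter(torch.ones(num_channels),
                                    requires_grad=False)

    def forward(self, x):
        # Keep only the channels whose index is still 1.
        selected = self.indexes.nonzero().squeeze(1)
        return x[:, selected, :, :]
```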

Baseline

The dataset argument specifies which dataset to use: cifar10 or cifar100. The arch argument specifies the architecture: vgg, resnet, or densenet. The depth is chosen to match the networks used in the paper.

python main.py --dataset cifar10 --arch vgg --depth 19
python main.py --dataset cifar10 --arch resnet --depth 164
python main.py --dataset cifar10 --arch densenet --depth 40

Train with Sparsity

python main.py -sr --s 0.0001 --dataset cifar10 --arch vgg --depth 19
python main.py -sr --s 0.00001 --dataset cifar10 --arch resnet --depth 164
python main.py -sr --s 0.00001 --dataset cifar10 --arch densenet --depth 40
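The -sr flag turns on sparsity regularization: an L1 penalty of strength s on the BN scaling factors, applied in practice as a subgradient step after each backward pass. A minimal sketch of that step (the function name update_bn_grad is ours; the repository performs the equivalent update inside main.py):

```python
import torch
import torch.nn as nn

def update_bn_grad(model, s=1e-4):
    """Add the subgradient of the L1 penalty s * |gamma| to the gradient
    of every BatchNorm scaling factor; call after loss.backward()."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.weight.grad.data.add_(s * torch.sign(m.weight.data))
```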

Prune

python vggprune.py --dataset cifar10 --depth 19 --percent 0.7 --model [PATH TO THE MODEL] --save [DIRECTORY TO STORE RESULT]
python resprune.py --dataset cifar10 --depth 164 --percent 0.4 --model [PATH TO THE MODEL] --save [DIRECTORY TO STORE RESULT]
python denseprune.py --dataset cifar10 --depth 40 --percent 0.4 --model [PATH TO THE MODEL] --save [DIRECTORY TO STORE RESULT]

The pruned model will be named pruned.pth.tar.
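Pruning ranks all BN scaling factors globally and removes the channels whose factors fall below the percent-th percentile. The thresholding step can be sketched as follows (the function name and mask-dictionary return format are our own; the real prune scripts additionally rebuild the smaller network and copy the surviving weights):

```python
import torch
import torch.nn as nn

def bn_prune_masks(model, percent=0.7):
    """Sketch of global BN-threshold pruning: gather every BatchNorm
    scaling factor, find the value below which `percent` of them fall,
    and return a 0/1 keep-mask per BN layer."""
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    # Value at the percent-th percentile of all scaling factors.
    threshold = torch.sort(gammas)[0][int(len(gammas) * percent)]
    return {name: (m.weight.data.abs() > threshold).float()
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}
```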

Fine-tune

python main.py --refine [PATH TO THE PRUNED MODEL] --dataset cifar10 --arch vgg --depth 19 --epochs 160

Results

The results are fairly close to those in the original paper, which were produced with Torch. Note that due to different random seeds, accuracy may fluctuate by up to ~0.5% on CIFAR-10 and ~1.5% on CIFAR-100 across runs, in our experience.

CIFAR10

| CIFAR10-Vgg | Baseline | Sparsity (1e-4) | Prune (70%) | Fine-tune-160 (70%) |
| :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 93.77 | 93.30 | 32.54 | 93.78 |
| Parameters | 20.04M | 20.04M | 2.25M | 2.25M |

| CIFAR10-Resnet-164 | Baseline | Sparsity (1e-5) | Prune (40%) | Fine-tune-160 (40%) | Prune (60%) | Fine-tune-160 (60%) |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 94.75 | 94.76 | 94.58 | 95.05 | 47.73 | 93.81 |
| Parameters | 1.71M | 1.73M | 1.45M | 1.45M | 1.12M | 1.12M |

| CIFAR10-Densenet-40 | Baseline | Sparsity (1e-5) | Prune (40%) | Fine-tune-160 (40%) | Prune (60%) | Fine-tune-160 (60%) |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 94.11 | 94.17 | 94.16 | 94.32 | 89.46 | 94.22 |
| Parameters | 1.07M | 1.07M | 0.69M | 0.69M | 0.49M | 0.49M |

CIFAR100

| CIFAR100-Vgg | Baseline | Sparsity (1e-4) | Prune (50%) | Fine-tune-160 (50%) |
| :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 72.12 | 72.05 | 5.31 | 73.32 |
| Parameters | 20.04M | 20.04M | 4.93M | 4.93M |

| CIFAR100-Resnet-164 | Baseline | Sparsity (1e-5) | Prune (40%) | Fine-tune-160 (40%) | Prune (60%) | Fine-tune-160 (60%) |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 76.79 | 76.87 | 48.0 | 77.36 | --- | --- |
| Parameters | 1.73M | 1.73M | 1.49M | 1.49M | --- | --- |

Note: when pruning 60% of the channels of ResNet-164 on CIFAR-100, some layers in this implementation may end up with all of their channels pruned, which causes an error. We therefore also provide a mask implementation, which applies a mask to the scaling factors in the BN layers instead of physically removing channels. With the mask implementation, the network pruned at 60% on ResNet-164/CIFAR-100 can still be trained.
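The mask variant can be sketched as zeroing the affected BN parameters in place rather than removing channels (apply_bn_mask is an illustrative name; see the repository's mask implementation for the real code):

```python
import torch
import torch.nn as nn

def apply_bn_mask(bn, mask):
    """Zero the scaling factor and bias of masked-out channels, so the
    BN layer outputs exactly zero there without changing any shapes."""
    bn.weight.data.mul_(mask)
    bn.bias.data.mul_(mask)
```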

| CIFAR100-Densenet-40 | Baseline | Sparsity (1e-5) | Prune (40%) | Fine-tune-160 (40%) | Prune (60%) | Fine-tune-160 (60%) |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Top1 Accuracy (%) | 73.27 | 73.29 | 67.67 | 73.76 | 19.18 | 73.19 |
| Parameters | 1.10M | 1.10M | 0.71M | 0.71M | 0.50M | 0.50M |

Contact

sunmj15 at gmail.com, liuzhuangthu at gmail.com