# Compressed networks from ENC (Caffe models)

Fine-tuned network models from "Efficient Neural Network Compression", CVPR 2019.
For the source code of the paper, please refer to [ENC].
- This repository contains the prototxt files
- Trained models: [driver] (see the loading sketch below)
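
The prototxt files here and the caffemodel weights from the drive link can be loaded with pycaffe in the usual way. A minimal sketch follows; the file names, the test image, and the `prob` output blob name are placeholders for illustration, not the actual names used in this repository.

```python
import numpy as np
import caffe

caffe.set_mode_cpu()  # or caffe.set_mode_gpu() + caffe.set_device(0) if a GPU is available

# Hypothetical file names: substitute a prototxt from this repo and the
# matching caffemodel downloaded from the drive link above.
net = caffe.Net('alexnet_enc_deploy.prototxt', 'alexnet_enc.caffemodel', caffe.TEST)

# Standard Caffe ImageNet preprocessing: CHW layout, BGR channel order,
# pixel values in [0, 255], mean-pixel subtraction.
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))
transformer.set_channel_swap('data', (2, 1, 0))
transformer.set_raw_scale('data', 255.0)
transformer.set_mean('data', np.array([104.0, 117.0, 123.0]))

# Run a single image through the compressed network and print the top-5 classes.
image = caffe.io.load_image('example.jpg')  # hypothetical test image
net.blobs['data'].data[...] = transformer.preprocess('data', image)
probs = net.forward()['prob'][0]            # assumes the deploy net ends in a 'prob' softmax
print('Top-5 class indices:', probs.argsort()[::-1][:5])
```

The same loading path should apply to the other released models once the corresponding prototxt/caffemodel pair is substituted; only the input size and preprocessing (e.g. for Cifar10) would need to change.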
## AlexNet with ImageNet
| Method | FLOPs | Weights | Top-1 Acc. | Top-5 Acc. |
|---|---|---|---|---|
| [ENC-Inf] | 37.5% | 18.0% | 56.74% | 80.14% |
| [ENC-Model] | 37.5% | 18.0% | 56.71% | 80.13% |
| Method | FLOPs | Top-1 Acc. | Top-5 Acc. |
|---|---|---|---|
| [ENC-Inf] | 31% | 56.66% | 79.74% |
## VGG-16 with ImageNet
| Method | FLOPs | Top-1 Acc. | Top-5 Acc. |
|---|---|---|---|
| [ENC-Model] | 20% | 71.06% | 89.95% |
| Method | FLOPs | Top-1 Acc. | Top-5 Acc. |
|---|---|---|---|
| [ENC-Model] | 24% | 70.95% | 89.95% |
## ResNet-56 with Cifar10
## Citation
    @conference{ENC_CVPR19,
      author    = {Hyeji Kim and Muhammad Umar Karim Khan and Chong-Min Kyung},
      title     = {Efficient Neural Network Compression},
      booktitle = {CVPR},
      month     = {June},
      year      = {2019},
    }