Experiments on CIFAR datasets with PyTorch

Introduction

Reimplementations of state-of-the-art CNN models on the CIFAR datasets with PyTorch, currently including:

1. ResNet
2. PreActResNet
3. WideResNet
4. ResNeXt
5. DenseNet

Results for other models will be added later.
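All of these architectures are built from residual-style blocks. As a rough illustration only (the repository's actual, authoritative definitions live in the models folder), a basic CIFAR ResNet block might look like this:

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """Minimal sketch of a basic residual block for CIFAR ResNets.

    Two 3x3 convolutions with batch norm, plus an identity (or 1x1
    projection) shortcut. This is a simplification for illustration,
    not the repository's code.
    """
    def __init__(self, in_planes, planes, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_planes, planes, 3, stride=stride,
                               padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, 3, stride=1,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        # Projection shortcut when the spatial size or channel count changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_planes != planes:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_planes, planes, 1, stride=stride, bias=False),
                nn.BatchNorm2d(planes),
            )

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + self.shortcut(x))
```

PreActResNet differs mainly in moving batch norm and ReLU before each convolution; the other architectures vary the width, cardinality, or connectivity of such blocks.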

Requirements: software

See the requirements for installing PyTorch.

Requirements: hardware

For most experiments, one or two K40 GPUs (~11 GB of memory each) are enough, because PyTorch is very memory efficient. However, to train DenseNet on CIFAR-10 or CIFAR-100, you need at least four K40 GPUs.

Usage

  1. Clone this repository
git clone https://github.com/junyuseu/pytorch-cifar-models.git

In this project, the network structures are defined in the models folder, and the script gen_mean_std.py computes the mean and standard deviation of the dataset.
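gen_mean_std.py itself is not reproduced here, but the per-channel statistics it computes (used for input normalization) can be sketched as follows, assuming the images have been loaded into a float array of shape (N, H, W, C) scaled to [0, 1]; the function name is illustrative:

```python
import numpy as np

def per_channel_stats(images):
    """Compute per-channel mean and std for a batch of images.

    images: float array of shape (N, H, W, C) with values in [0, 1].
    Returns (mean, std), each a length-C array, suitable for passing
    to a normalization transform.
    """
    mean = images.mean(axis=(0, 1, 2))  # average over batch and spatial dims
    std = images.std(axis=(0, 1, 2))
    return mean, std
```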

  2. Edit main.py and run.sh

In main.py, you can specify the network you want to train, for example:

model = resnet20_cifar(num_classes=10)
...
fdir = 'result/resnet20_cifar10'

Then, you need to specify some parameters for training in run.sh. For resnet20:

CUDA_VISIBLE_DEVICES=0 python main.py --epoch 160 --batch-size 128 --lr 0.1 --momentum 0.9 --wd 1e-4 -ct 10
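These flags are consumed by main.py's argument parser. A hypothetical sketch of how such a parser might be defined (the real main.py may declare its arguments differently):

```python
import argparse

def build_parser():
    # Illustrative parser covering the flags used in run.sh above;
    # defaults mirror the resnet20 example.
    p = argparse.ArgumentParser(description='CIFAR training')
    p.add_argument('--epoch', type=int, default=160,
                   help='number of training epochs')
    p.add_argument('--batch-size', type=int, default=128)
    p.add_argument('--lr', type=float, default=0.1,
                   help='initial learning rate')
    p.add_argument('--momentum', type=float, default=0.9)
    p.add_argument('--wd', type=float, default=1e-4,
                   help='weight decay')
    p.add_argument('-ct', '--cifar-type', type=int, default=10,
                   help='10 for CIFAR-10, 100 for CIFAR-100')
    p.add_argument('-e', '--evaluate', action='store_true',
                   help='evaluate a trained model instead of training')
    p.add_argument('--resume', type=str, default='',
                   help='path to a checkpoint to resume from')
    return p
```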
  3. Train
nohup sh run.sh > resnet20_cifar10.log &

After training, the training log is recorded in the .log file, and the best model (on the test set) is stored in fdir.
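The "keep the latest checkpoint, copy it to model_best when the test accuracy improves" pattern can be sketched as below. The repository's actual saving code is in main.py and presumably uses torch.save; pickle is used here only to keep the example self-contained:

```python
import os
import pickle
import shutil

def save_checkpoint(state, is_best, fdir):
    """Save the latest training state; keep a copy of the best one.

    state: a dict (e.g. epoch, model weights, best accuracy).
    is_best: whether this state beats the previous best on the test set.
    fdir: output directory, e.g. 'result/resnet20_cifar10'.
    """
    os.makedirs(fdir, exist_ok=True)
    path = os.path.join(fdir, 'checkpoint.pth.tar')
    with open(path, 'wb') as f:
        pickle.dump(state, f)
    if is_best:
        # The best model is what the test step later resumes from.
        shutil.copyfile(path, os.path.join(fdir, 'model_best.pth.tar'))
```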

Note: on the first training run, the CIFAR-10 or CIFAR-100 dataset will be downloaded, so make sure your computer is online. Otherwise, download the datasets, decompress them, and put them in the data folder.

  4. Test
CUDA_VISIBLE_DEVICES=0 python main.py -e --resume=fdir/model_best.pth.tar
  5. CIFAR-100

The default setting in the code is for CIFAR-10. To train on CIFAR-100, you need to specify it explicitly in the code:

model = resnet20_cifar(num_classes=100)

Note: you should also change fdir accordingly, and in run.sh you should set -ct 100.

Results

Note: each of the results below comes from a single run.

We obtained results comparable to, or even better than, those in the original papers; the experimental settings follow the original ones exactly.

ResNet

| layers | #params | error (%) |
|--------|---------|-----------|
| 20     | 0.27M   | 8.33      |
| 32     | 0.46M   | 7.36      |
| 44     | 0.66M   | 6.77      |
| 56     | 0.85M   | 6.73      |
| 110    | 1.7M    | 6.13      |
| 1202   | 19.4M   | -         |

PreActResNet

| dataset   | network     | baseline unit | pre-activation unit |
|-----------|-------------|---------------|---------------------|
| CIFAR-10  | ResNet-110  | 6.13          | 6.13                |
| CIFAR-10  | ResNet-164  | 5.84          | 5.35                |
| CIFAR-10  | ResNet-1001 | 11.27         | 5.13                |
| CIFAR-100 | ResNet-164  | 24.99         | 24.50               |
| CIFAR-100 | ResNet-1001 | 31.73         | 24.03               |

WideResNet

| depth-k | #params | CIFAR-10 | CIFAR-100 |
|---------|---------|----------|-----------|
| 20-10   | 26.8M   | 4.27     | 19.73     |
| 26-10   | 36.5M   | 3.89     | 19.51     |

ResNeXt

| network           | #params | CIFAR-10 | CIFAR-100 |
|-------------------|---------|----------|-----------|
| ResNeXt-29,1x64d  | 4.9M    | 4.51     | 22.09     |
| ResNeXt-29,8x64d  | 34.4M   | 3.78     | 17.44     |
| ResNeXt-29,16x64d | 68.1M   | 3.69     | 17.11     |

DenseNet

| network           | depth | #params | CIFAR-10 | CIFAR-100 |
|-------------------|-------|---------|----------|-----------|
| DenseNet-BC(k=12) | 100   | 0.8M    | 4.69     | 22.19     |
| DenseNet-BC(k=24) | 250   | 15.3M   | 3.44     | 17.17     |
| DenseNet-BC(k=40) | 190   | 25.6M   | 3.41     | 17.33     |

References:

[1] K. He, X. Zhang, S. Ren, and J. Sun. Deep residual learning for image recognition. In CVPR, 2016.

[2] K. He, X. Zhang, S. Ren, and J. Sun. Identity mappings in deep residual networks. In ECCV, 2016.

[3] S. Zagoruyko and N. Komodakis. Wide residual networks. In BMVC, 2016.

[4] S. Xie, R. Girshick, P. Dollár, Z. Tu, and K. He. Aggregated residual transformations for deep neural networks. In CVPR, 2017.

[5] G. Huang, Z. Liu, L. van der Maaten, and K. Q. Weinberger. Densely connected convolutional networks. In CVPR, 2017.