ShaResNet

Reducing residual network parameter number by sharing weights

Paper

Link to the arXiv paper: https://arxiv.org/abs/1702.08782

Project

This work is part of the DeLTA project (delta-onera.github.io) at ONERA, which aims to develop innovative machine learning approaches for aerospace applications.

Architecture

(Figure: ShaResNet architecture, with 3x3 convolutions shared within each stage.)

ShaResNets are residual networks that share the convolutions involved in spatial connections: within a given stage, all 3x3 convolutions use the same weights. The result is a lighter model with between 20% and 45% fewer parameters; quantitative figures are given in the Results section below.
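To make the idea concrete, here is a minimal Torch7 sketch of one such stage, simplified to a single 3x3 convolution per pre-activation block (the function and block structure are illustrative, not the repository's exact model definition). Because every block reuses the same weights, a stage of B blocks stores one 9C² weight tensor instead of B of them.

```lua
require 'nn'

-- Illustrative ShaResNet stage: every residual block reuses the SAME
-- 3x3 convolution weights, while batch normalizations stay block-specific.
local function sharedStage(nChannels, nBlocks)
  -- The single 3x3 convolution whose parameters the whole stage shares.
  local conv3x3 = nn.SpatialConvolution(nChannels, nChannels, 3, 3, 1, 1, 1, 1)

  local stage = nn.Sequential()
  for i = 1, nBlocks do
    local branch = nn.Sequential()
    branch:add(nn.SpatialBatchNormalization(nChannels))
    branch:add(nn.ReLU(true))
    -- clone() with parameter names is the standard Torch7 idiom for
    -- weight sharing: the clones point at the same underlying tensors.
    branch:add(conv3x3:clone('weight', 'bias', 'gradWeight', 'gradBias'))

    -- Residual connection: output = branch(x) + x.
    stage:add(nn.Sequential()
      :add(nn.ConcatTable():add(branch):add(nn.Identity()))
      :add(nn.CAddTable()))
  end
  return stage
end

-- Four residual blocks, but only one set of 3x3 convolution weights.
local stage = sharedStage(64, 4)
print(stage:forward(torch.randn(1, 64, 32, 32)):size())
```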

Results

Parameter counts for ShaResNets compared to the original residual networks. WRN stands for Wide Residual Network (Sergey Zagoruyko and Nikos Komodakis, BMVC 2016, http://arxiv.org/abs/1605.07146).

| Dataset   | Model      | Param. original | Param. shared | Param. decrease |
|-----------|------------|-----------------|---------------|-----------------|
| CIFAR 10  | ResNet-164 | 1.70 M          | 0.93 M        | 45%             |
| CIFAR 10  | WRN-40-4   | 8.95 M          | 5.85 M        | 35%             |
| CIFAR 100 | WRN-28-10  | 36.54 M         | 26.86 M       | 26%             |
| ImageNet  | ResNet-34  | 21.8 M          | 13.6 M        | 37%             |
| ImageNet  | ResNet-50  | 25.6 M          | 20.5 M        | 20%             |
| ImageNet  | ResNet-101 | 44.5 M          | 29.4 M        | 33%             |
| ImageNet  | ResNet-152 | 60.2 M          | 36.8 M        | 39%             |

Top-1 and top-5 error rates on CIFAR 10, CIFAR 100, and the ImageNet ILSVRC 2012 validation set.

| Dataset   | Model      | Top-1 err. original | Top-1 err. shared | Top-5 err. original | Top-5 err. shared |
|-----------|------------|---------------------|-------------------|---------------------|-------------------|
| CIFAR 10  | ResNet-164 | 5.46%               | 6.2%              | –                   | –                 |
| CIFAR 10  | WRN-40-4   | 4.17%               | 5.1%              | –                   | –                 |
| CIFAR 100 | WRN-28-10  | 20%                 | 20.2%             | –                   | –                 |
| ImageNet  | ResNet-34  | 26.73%              | 28.25%            | 8.74%               | 9.42%             |
| ImageNet  | ResNet-50  | 24.01%              | 24.61%            | 7.02%               | 7.41%             |
| ImageNet  | ResNet-101 | 22.44%              | 22.91%            | 6.21%               | 6.55%             |
| ImageNet  | ResNet-152 | 22.16%              | 22.23%            | 6.16%               | 6.14%             |

Code

The experiments use Torch7 with the nn (neural network) package. We provide the model definitions in this repository.
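A hypothetical way to load one of the provided definitions and run a forward pass (the file name and the CIFAR-sized input below are illustrative, not taken from the repository):

```lua
require 'nn'

-- Load a model definition; 'models/sharesnet.lua' is a hypothetical
-- path, assuming the definition file returns the constructed network.
local model = dofile('models/sharesnet.lua')

-- One CIFAR-sized input: batch of 1, 3 channels, 32x32 pixels.
local output = model:forward(torch.randn(1, 3, 32, 32))
print(output:size())
```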

CIFAR

The models for CIFAR 10 and CIFAR 100 were trained using the original implementation of Wide Residual Networks at github.com/szagoruyko/wide-residual-networks.

ImageNet

The provided model for ImageNet is meant to be trained with Facebook's code at github.com/facebook/fb.resnet.torch.

Pre-trained models

The pre-trained weights for ImageNet are available: