SENet.pytorch

An implementation of SENet, proposed in Squeeze-and-Excitation Networks by Jie Hu, Li Shen and Gang Sun, the winners of the ILSVRC 2017 classification competition.

Currently, SE-ResNet (depths 18, 34, 50, 101 and 152 for ImageNet; 20 and 32 for CIFAR) and SE-Inception-v3 are implemented.

For SE-Inception-v3, the input size must be 299x299, as in the original Inception-v3.
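For reference, the core squeeze-and-excitation operation can be sketched as below. This is a minimal stand-alone sketch, not this repository's exact class; the name `SELayer` and the default reduction of 16 follow the paper's description.

```python
import torch
import torch.nn as nn


class SELayer(nn.Module):
    """Squeeze-and-Excitation: global-average-pool the feature map ("squeeze"),
    pass it through a small bottleneck MLP with sigmoid gating ("excitation"),
    and rescale each channel by the resulting weight."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))           # squeeze: (B, C)
        w = self.fc(w).view(b, c, 1, 1)  # excitation: per-channel gates in (0, 1)
        return x * w                     # rescale the feature map


x = torch.randn(2, 64, 8, 8)
out = SELayer(64)(x)  # same shape as the input
```

The block is drop-in: it changes only channel scaling, so it can be inserted into any residual or inception branch without altering spatial dimensions.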

Pre-requirements

The codebase is tested with the following settings.

For training

To run cifar.py or imagenet.py, you need

hub

You can use some SE-ResNet models (se_resnet{20, 56, 50, 101}) via torch.hub.

import torch.hub
hub_model = torch.hub.load(
    'moskomule/senet.pytorch',
    'se_resnet20',
    num_classes=10)

Also, a pretrained SE-ResNet50 model is available.

import torch.hub
hub_model = torch.hub.load(
    'moskomule/senet.pytorch',
    'se_resnet50',
    pretrained=True)

Results

SE-ResNet20/Cifar10

python cifar.py [--baseline]

Note that the CIFAR-10 dataset is expected to be under ~/.torch/data.

|                    | ResNet20 | SE-ResNet20 (reduction 4 or 8) |
|--------------------|----------|--------------------------------|
| max. test accuracy | 92%      | 93%                            |
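The reduction ratio controls the size of the SE bottleneck and hence its parameter cost: with a bias-free two-layer excitation MLP, each SE layer adds roughly 2·C²/r weights for C channels. A quick back-of-the-envelope, assuming the standard CIFAR ResNet channel widths (16, 32, 64):

```python
def se_params(channels: int, reduction: int) -> int:
    # Two bias-free linear layers: C -> C/r -> C
    hidden = channels // reduction
    return channels * hidden + hidden * channels

# Extra parameters per SE layer at each CIFAR ResNet stage
for c in (16, 32, 64):
    print(f"C={c}: r=4 -> {se_params(c, 4)} params, r=8 -> {se_params(c, 8)} params")
```

Even at reduction 4 the overhead is small (about 2k parameters for the widest 64-channel stage), which is why SE blocks improve accuracy at little cost.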

SE-ResNet50/ImageNet

python [-m torch.distributed.launch --nproc_per_node=${NUM_GPUS}] imagenet.py

The option [-m ...] is for distributed training. Note that the ImageNet dataset is expected to be under ~/.torch/data, or its location can be specified via IMAGENET_ROOT=${PATH_TO_IMAGENET}.
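For example, a single-node run on 4 GPUs with a custom dataset location could look like the following (the dataset path here is purely illustrative):

```shell
# Point the script at the dataset and launch one process per GPU
IMAGENET_ROOT=/data/imagenet \
python -m torch.distributed.launch --nproc_per_node=4 imagenet.py
```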

The initial learning rate and mini-batch size differ from the original version because of my limited computational resources.

|                            | ResNet     | SE-ResNet   |
|----------------------------|------------|-------------|
| max. test accuracy (top-1) | 76.15% (*) | 77.06% (**) |
# Download the released weights first:
# wget https://github.com/moskomule/senet.pytorch/releases/download/archive/seresnet50-60a8950a85b2b.pkl
import torch
from senet.se_resnet import se_resnet50  # model definition from this repository

senet = se_resnet50(num_classes=1000)
senet.load_state_dict(torch.load("seresnet50-60a8950a85b2b.pkl"))

Contribution

I cannot maintain this repository actively, but any contributions are welcome. Feel free to send PRs and issues.

References

paper

authors' Caffe implementation