IR-Net

This project is the PyTorch implementation of our paper accepted to CVPR 2020: Forward and Backward Information Retention for Accurate Binary Neural Networks. [PDF]

Bibtex:

@inproceedings{Qin:cvpr20,
  author    = {Haotong Qin and Ruihao Gong and Xianglong Liu and Mingzhu Shen and Ziran Wei and Fengwei Yu and Jingkuan Song},
  title     = {Forward and Backward Information Retention for Accurate Binary Neural Networks},
  booktitle = {IEEE CVPR},
  year      = {2020},
}

IR-Net: We implement IR-Net in PyTorch because of its high flexibility and powerful automatic differentiation mechanism. To construct a binarized model, we simply replace the convolutional layers of the original model with binary convolutional layers binarized by our method.
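As a rough illustration of such a drop-in replacement, the sketch below shows a generic binary convolution with a sign function in the forward pass and a clipped straight-through estimator in the backward pass. This is a simplified stand-in, not the paper's exact method: IR-Net additionally reshapes weights before binarization (Libra-PB) and uses a progressively sharpening gradient estimator (EDE), both omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Sign in the forward pass; clipped straight-through estimator backward."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        # Pass gradients only where |x| <= 1 (clipped STE).
        return grad_output * (x.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    """Drop-in replacement for nn.Conv2d that binarizes weights and inputs."""

    def forward(self, x):
        bw = BinarizeSTE.apply(self.weight)  # binarized weights
        bx = BinarizeSTE.apply(x)            # binarized activations
        return F.conv2d(bx, bw, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```

Because `BinaryConv2d` keeps the `nn.Conv2d` constructor signature, swapping it into an existing model is a one-line change per layer.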

Network Structures: We employ widely used network structures, including VGG-Small, ResNet-20, and ResNet-18 for CIFAR-10, and ResNet-18 and ResNet-34 for ImageNet. To demonstrate the versatility of IR-Net, we evaluate it on both the normal structure and the Bi-Real structure of ResNet. All convolutional and fully-connected layers except the first and last ones are binarized, and we select Hardtanh as the activation function instead of ReLU.
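The activation swap described above can be done generically. The helper below (a hypothetical utility, not from the paper's code) recursively replaces every `nn.ReLU` in a model with `nn.Hardtanh`, whose output is bounded to [-1, 1] and therefore pairs naturally with sign-based binarization:

```python
import torch.nn as nn

def relu_to_hardtanh(model: nn.Module) -> nn.Module:
    """Recursively swap every ReLU for Hardtanh, which clamps
    activations to [-1, 1] before they are binarized."""
    for name, child in model.named_children():
        if isinstance(child, nn.ReLU):
            setattr(model, name, nn.Hardtanh(inplace=child.inplace))
        else:
            relu_to_hardtanh(child)  # descend into nested modules
    return model
```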

Initialization: IR-Net is trained from scratch (random initialization) without leveraging any pre-trained model. To evaluate IR-Net on various network architectures, we mostly follow the hyper-parameter settings of their original papers. In all experiments, we use SGD as the optimization algorithm.
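A minimal training-setup sketch along these lines is shown below. The concrete hyper-parameters (learning rate, momentum, weight decay, schedule) are illustrative placeholders, since the paper defers to each architecture's original recipe; the linear model is a stand-in for a binarized network:

```python
import torch

# Stand-in for a randomly initialized binarized network (trained from scratch).
model = torch.nn.Linear(10, 10)

# SGD, as stated in the paper; the specific values here are assumptions.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=400)
```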

Dependencies

For the GPUs, we use a single NVIDIA GeForce GTX 1080 Ti when training IR-Net on the CIFAR-10 dataset and 32 NVIDIA GeForce GTX 1080 Ti GPUs when training IR-Net on the ImageNet dataset.

Accuracy:

CIFAR-10:

| Topology  | Bit-Width (W/A) | Accuracy (%) |
|-----------|-----------------|--------------|
| ResNet-20 | 1/1             | 86.5         |
| ResNet-20 | 1/32            | 90.8         |
| VGG-Small | 1/1             | 90.4         |
| ResNet-18 | 1/1             | 91.5         |

ImageNet:

| Topology  | Bit-Width (W/A) | Top-1 (%) | Top-5 (%) |
|-----------|-----------------|-----------|-----------|
| ResNet-18 | 1/1             | 58.1      | 80.0      |
| ResNet-18 | 1/32            | 66.5      | 86.8      |
| ResNet-34 | 1/1             | 62.9      | 84.1      |
| ResNet-34 | 1/32            | 70.4      | 89.5      |