ReActNet

This is the PyTorch implementation of our paper "ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions", published at ECCV 2020.

<div align=center> <img width=60% src="https://github.com/liuzechun0216/images/blob/master/reactnet_github.jpg"/> </div>

In this paper, we propose generalizing the traditional Sign and PReLU functions to RSign and RPReLU, which enable explicit learning of the activation distribution's reshape and shift at near-zero extra cost. By adding simple learnable biases, ReActNet achieves 69.4% top-1 accuracy on the ImageNet dataset with both weights and activations binarized, approaching ResNet-level accuracy.
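The two generalized functions can be sketched in PyTorch as follows. This is a minimal illustration based on the description above, not the repository's exact code: the shifts are per-channel learnable parameters, and a plain straight-through estimator stands in for the gradient approximation used in practice (the module layout and parameter initializations here are assumptions).

```python
import torch
import torch.nn as nn

class RSign(nn.Module):
    """Generalized Sign: binarize activations around a learnable
    per-channel threshold alpha (initialized to zero here)."""
    def __init__(self, channels: int):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(1, channels, 1, 1))

    def forward(self, x):
        shifted = x - self.alpha
        binary = torch.sign(shifted)
        # Straight-through estimator: forward pass outputs the binary
        # values, backward pass sees an identity w.r.t. the shifted input.
        return shifted + (binary - shifted).detach()

class RPReLU(nn.Module):
    """Generalized PReLU: a PReLU with learnable per-channel input
    shift (gamma) and output shift (zeta)."""
    def __init__(self, channels: int):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.zeta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.prelu = nn.PReLU(channels)

    def forward(self, x):
        return self.prelu(x - self.gamma) + self.zeta
```

With all shifts at their zero initialization, RSign reduces to the ordinary sign function and RPReLU to an ordinary PReLU, so the extra parameters only reshape and shift the distributions as training demands.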

Citation

If you find our code useful for your research, please consider citing:

```
@inproceedings{liu2020reactnet,
  title={ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions},
  author={Liu, Zechun and Shen, Zhiqiang and Savvides, Marios and Cheng, Kwang-Ting},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2020}
}
```

Run

1. Requirements:

2. Data:

3. Steps to run:

(1) Step 1: binarizing activations

(2) Step 2: binarizing weights + activations
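In the second step, weights are binarized in addition to the activations. A common way to do this (used by XNOR-Net-style methods and sketched here as an assumption, not as this repository's exact implementation) is to map each weight to ±alpha, where alpha is a per-output-channel scaling factor, and again use a straight-through estimator for gradients:

```python
import torch

def binarize_weights(w: torch.Tensor) -> torch.Tensor:
    """Binarize a 4-D conv weight tensor to {-alpha, +alpha} per output
    channel, with alpha = mean absolute weight of that channel (a common
    scaling choice; the exact scheme in the repo may differ)."""
    alpha = w.abs().mean(dim=(1, 2, 3), keepdim=True)
    bw = alpha * torch.sign(w)
    # Straight-through estimator: binary values forward, identity backward.
    return w + (bw - w).detach()
```

Training step 2 from the step-1 checkpoint means the network starts with activations already adapted to binarization, which is the motivation for the two-step schedule.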

Models

| Methods | Top-1 Acc | FLOPs | Trained Model |
| --- | --- | --- | --- |
| XNOR-Net | 51.2% | 1.67 x 10^8 | - |
| Bi-Real Net | 56.4% | 1.63 x 10^8 | - |
| Real-to-Binary | 65.4% | 1.83 x 10^8 | - |
| ReActNet (Bi-Real based) | 65.9% | 1.63 x 10^8 | Model-ReAct-ResNet |
| ReActNet-A | 69.5% | 0.87 x 10^8 | Model-ReAct-MobileNet |

Contact

Zechun Liu, HKUST (zliubq at connect.ust.hk)

Zhiqiang Shen, CMU (zhiqians at andrew.cmu.edu)