EcoFormer: Energy-Saving Attention with Linear Complexity (NeurIPS 2022 Spotlight) 🚀

<a href="https://arxiv.org/abs/2209.09004"><img src="https://img.shields.io/badge/arXiv-2209.09004-b31b1b.svg" height=22.5></a> <a href="https://pytorch.org/get-started/locally/"><img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-ee4c2c?logo=pytorch&logoColor=white"></a>

This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity" by Jing Liu, Zizheng Pan, Haoyu He, Jianfei Cai, and Bohan Zhuang.

News

A Gentle Introduction

EcoFormer

We present EcoFormer, a novel energy-saving attention mechanism with linear complexity that eliminates the vast majority of multiplications in attention from a new binarization perspective: queries and keys are mapped to compact binary codes, so the expensive multiply-accumulate operations of softmax attention are largely replaced by simple additions. More details can be found in our paper.
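
Conceptually, once queries and keys are binary, softmax attention can be replaced by a kernel-style linear attention: a shared key-value summary is computed once and reused for every query, so the quadratic query-key matrix is never formed, and the remaining products against binary features are cheap. The snippet below is only a minimal sketch of that factorization, using a random sign projection as a stand-in for EcoFormer's learned hashing; the names binary_hash, linear_binary_attention, and proj are illustrative and not part of this repository's API.

import torch

def binary_hash(x, proj):
    # Illustrative binarization: project to a few bits, take the sign, and map to
    # {0, 1} so the resulting attention weights stay non-negative.
    # (EcoFormer learns its hashing functions; this random projection is a stand-in.)
    return (torch.sign(x @ proj) + 1) / 2

def linear_binary_attention(q, k, v, proj):
    # q, k, v: (batch, tokens, dim); proj: (dim, bits). Complexity is linear in tokens.
    phi_q = binary_hash(q, proj)                            # (batch, tokens, bits)
    phi_k = binary_hash(k, proj)                            # (batch, tokens, bits)
    kv = torch.einsum('ntb,ntd->nbd', phi_k, v)             # shared key-value summary
    norm = torch.einsum('ntb,nb->nt', phi_q, phi_k.sum(1))  # per-query normalizer
    return torch.einsum('ntb,nbd->ntd', phi_q, kv) / (norm.unsqueeze(-1) + 1e-6)

# Example: 196 tokens with 64-dim heads hashed to 16 bits.
q, k, v = (torch.randn(2, 196, 64) for _ in range(3))
out = linear_binary_attention(q, k, v, torch.randn(64, 16))
print(out.shape)  # torch.Size([2, 196, 64])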

Installation

Requirements

Instructions

We use Anaconda to create the running environment for this project. To set it up, run:

git clone https://github.com/ziplab/EcoFormer
cd EcoFormer
conda env create -f environment/environment.yml
conda activate ecoformer

Note: If the above instructions do not work on your machine, please refer to environment/README.md for manual installation and troubleshooting.

Getting Started

For experiments on PVTv2, please refer to pvt.

For experiments on Twins, please refer to twins.

Results and Model Zoo

| Model | #Mul. (B) | #Add. (B) | Energy (B pJ) | Throughput (images/s) | Top-1 Acc. (%) | Download |
| --- | --- | --- | --- | --- | --- | --- |
| PVTv2-B0 | 0.54 | 0.56 | 2.5 | 1379 | 70.44 | Github |
| PVTv2-B1 | 2.03 | 2.09 | 9.4 | 874 | 78.38 | Github |
| PVTv2-B2 | 3.85 | 3.97 | 17.8 | 483 | 81.28 | Github |
| PVTv2-B3 | 6.54 | 6.75 | 30.25 | 325 | 81.96 | Github |
| PVTv2-B4 | 9.57 | 9.82 | 44.25 | 249 | 81.90 | Github |
| Twins-SVT-S | 2.72 | 2.81 | 12.6 | 576 | 80.22 | Github |
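
For reference on the energy column: the numbers are consistent with weighting each operation by the widely cited 45nm per-operation costs (roughly 3.7 pJ per FP32 multiplication and 0.9 pJ per addition, from Horowitz's ISSCC 2014 survey). Whether the paper uses exactly these constants is an assumption, but the quick check below reproduces the table to within rounding.

MUL_PJ, ADD_PJ = 3.7, 0.9  # assumed 45nm per-operation energies; not taken from this repo

def energy_bpj(muls_b, adds_b):
    # Counts are in billions of operations, so the result is in billions of pJ.
    return muls_b * MUL_PJ + adds_b * ADD_PJ

print(round(energy_bpj(0.54, 0.56), 1))  # 2.5 B pJ, matching the PVTv2-B0 row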

Citation

If you find EcoFormer useful in your research, please consider citing the following paper:

@inproceedings{liu2022ecoformer,
  title={EcoFormer: Energy-Saving Attention with Linear Complexity},
  author={Liu, Jing and Pan, Zizheng and He, Haoyu and Cai, Jianfei and Zhuang, Bohan},
  booktitle={NeurIPS},
  year={2022}
}

License

This repository is released under the Apache 2.0 license as found in the LICENSE file.

Acknowledgement

This repository is built upon PVT and Twins. We thank the authors for their open-sourced code.