BiMatting: Efficient Video Matting via Binarization

This project is the official implementation of our NeurIPS 2023 paper BiMatting: Efficient Video Matting via Binarization [PDF], created by researchers from Beihang University and ETH Zürich.

Introduction

Real-time video matting on edge devices faces significant computational resource constraints, limiting the widespread use of video matting in applications such as online conferences and short-form video production. Binarization is a powerful compression approach that greatly reduces computation and memory consumption by using 1-bit parameters and bitwise operations. However, binarization of the video matting model is not a straightforward process, and our empirical analysis has revealed two primary bottlenecks: severe representation degradation of the encoder and massive redundant computations of the decoder. To address these issues, we propose BiMatting, an accurate and efficient video matting model using binarization. Specifically, we construct shrinkable and dense topologies of the binarized encoder block to enhance the extracted representation. We sparsify the binarized units to reduce the low-information decoding computation. Through extensive experiments, we demonstrate that BiMatting outperforms other binarized video matting models, including state-of-the-art (SOTA) binarization methods, by a significant margin. Our approach even performs comparably to the full-precision counterpart in visual quality. Furthermore, BiMatting achieves remarkable savings of 12.4$\times$ and 21.6$\times$ in computation and storage, respectively, showcasing its potential and advantages in real-world resource-constrained scenarios.
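
For readers new to network binarization, the sketch below illustrates the general mechanism the description above refers to: weights and activations are constrained to 1-bit values ({-1, +1} via a sign function), with a straight-through estimator (STE) used during training. This is only a generic PyTorch illustration under common binarization conventions (clipped STE, per-channel full-precision scaling, both of which are assumptions here); it is not the actual BiMatting encoder or decoder block, whose shrinkable/dense and sparse designs are described in the paper.

# Minimal, generic sketch of 1-bit binarization with a straight-through
# estimator (STE), for illustration only -- this is NOT the BiMatting
# encoder/decoder block described in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization; gradients pass straight through where |x| <= 1."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Clipped STE: zero the gradient where the input saturates.
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)

class BinaryConv2d(nn.Conv2d):
    """Convolution with weights and activations binarized to {-1, +1}.
    A full-precision per-output-channel scale (mean of |W|) is kept,
    a common convention in binarized networks (an assumption here,
    not BiMatting's exact scheme)."""
    def forward(self, x):
        w_bin = BinarizeSTE.apply(self.weight)
        x_bin = BinarizeSTE.apply(x)
        scale = self.weight.abs().mean(dim=(1, 2, 3), keepdim=True)
        return F.conv2d(x_bin, w_bin * scale, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

if __name__ == "__main__":
    layer = BinaryConv2d(3, 16, kernel_size=3, padding=1)
    out = layer(torch.randn(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 16, 64, 64])

At deployment time, such 1-bit weights and activations can be packed into bit vectors and evaluated with XNOR/popcount kernels, which is the general source of the computation and storage savings that binarization offers.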

Dependencies

# From the root directory of the repository
pip install -r requirements.txt

Execution

# We provide scripts to train and test our model
sh scripts/train.sh
sh scripts/test.sh

Results

VM 512x512

| Model  | pha_mad | pha_mse | pha_grad | pha_conn | pha_dtssd | fgr_mad | fgr_mse |
| ------ | ------- | ------- | -------- | -------- | --------- | ------- | ------- |
| stage1 | 15.06   | 8.75    | 2.83     | 1.76     | 2.70      | 42.46   | 14.81   |
| stage2 | 13.50   | 7.02    | 3.32     | 1.52     | 2.69      | 46.28   | 15.39   |
| stage3 | 12.67   | 6.89    | 2.78     | 1.40     | 2.65      | 39.07   | 13.30   |
| stage4 | 12.82   | 6.65    | 2.97     | 1.42     | 2.69      | 363.69  | 213.77  |
| paper  | 12.82   | 6.65    | 2.97     | 1.42     | 2.69      | 363.69  | 213.77  |

VM 1920x1080

| Model  | pha_mad | pha_mse | pha_grad | pha_dtssd |
| ------ | ------- | ------- | -------- | --------- |
| stage1 | 19.33   | 11.63   | 27.73    | 3.42      |
| stage2 | 19.68   | 11.80   | 29.64    | 3.39      |
| stage3 | 17.95   | 11.41   | 22.13    | 3.18      |
| stage4 | 17.71   | 10.79   | 22.10    | 3.24      |
| paper  | 18.16   | 11.15   | 21.90    | 2.25      |

Citation

If you find our work useful in your research, please consider citing:

@inproceedings{qin2023bimatting,
    author={Haotong Qin and Lei Ke and Xudong Ma and Martin Danelljan and Yu-Wing Tai and Chi-Keung Tang and Xianglong Liu and Fisher Yu},
    title={BiMatting: Efficient Video Matting via Binarization},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
}