Pixel-wise Warping for Video Stabilization (PWNet)

This is a PyTorch implementation of Pixel-wise Warping for Video Stabilization.
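Pixel-wise warping means predicting a displacement for every pixel and resampling the frame accordingly, rather than fitting a single global (or grid-based) transform. A minimal PyTorch sketch of the resampling step is below; the network and flow field here are placeholders, not the actual PWNet architecture. With zero flow, warping returns the input frame unchanged.

```python
import torch
import torch.nn.functional as F

# A toy frame: batch of 1, 3 channels, 8x8 pixels.
frame = torch.rand(1, 3, 8, 8)

# Identity sampling grid in normalized [-1, 1] coordinates,
# shape (N, H, W, 2) as expected by grid_sample.
theta = torch.tensor([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]])
grid = F.affine_grid(theta, frame.shape, align_corners=True)

# A predicted per-pixel flow field would be added to this grid;
# here the flow is zero, so warping reproduces the original frame.
flow = torch.zeros_like(grid)
warped = F.grid_sample(frame, grid + flow, align_corners=True)
```

In the full model, `flow` would come from the network, giving each pixel its own stabilizing displacement.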

Source code and pre-trained models will be released soon!

If you have any questions, please contact me at zmd1992@mail.ustc.edu.cn.

Table of Contents

- Prerequisites
- Training
- Evaluation
- Performance
- Demos
- Authors
- References

Prerequisites

Datasets

Training uses the DeepStab dataset (7.9 GB), available at http://cg.cs.tsinghua.edu.cn/download/DeepStab.zip, courtesy of Miao Wang et al. [1].

Training

mkdir weights
cd weights
wget https://s3.amazonaws.com/amdegroot-models/vgg16_reducedfc.pth
CUDA_VISIBLE_DEVICES=0,1 python3 main.py
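DeepStab pairs each unstable video with a stabilized version, so a natural objective is to warp each unstable frame toward its stable counterpart. The sketch below illustrates one such training step; all names and the loss choice are illustrative assumptions, not taken from main.py.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-in for the flow-predicting network (not the real PWNet).
flow_net = torch.nn.Conv2d(3, 2, kernel_size=3, padding=1)
optimizer = torch.optim.Adam(flow_net.parameters(), lr=1e-4)

unstable = torch.rand(2, 3, 32, 32)   # batch of unstable frames
stable = torch.rand(2, 3, 32, 32)     # corresponding stable frames

# Identity grid plus a predicted per-pixel flow, as in pixel-wise warping.
theta = torch.tensor([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]]).expand(2, -1, -1)
grid = F.affine_grid(theta, unstable.shape, align_corners=True)

flow = flow_net(unstable).permute(0, 2, 3, 1)        # (N, H, W, 2)
warped = F.grid_sample(unstable, grid + flow, align_corners=True)
loss = F.mse_loss(warped, stable)                    # assumed reconstruction loss

optimizer.zero_grad()
loss.backward()
optimizer.step()
```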

Evaluation

To evaluate a trained network:

python eval.py

You can override the parameters listed in eval.py either by passing them as command-line flags or by editing the file directly.
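Such parameters are typically exposed through argparse; the flag names below are hypothetical stand-ins, since the real ones are defined in eval.py.

```python
import argparse

# Hypothetical flags for illustration only; see eval.py for the real names.
parser = argparse.ArgumentParser(description="Evaluate a trained PWNet model")
parser.add_argument("--trained_model", default="weights/pwnet_final.pth",
                    help="checkpoint to evaluate (hypothetical name)")
parser.add_argument("--input_video", default="demo.avi",
                    help="unstable input video (hypothetical name)")

# Equivalent to: python eval.py --input_video my_clip.avi
args = parser.parse_args(["--input_video", "my_clip.avi"])
```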

Performance

Detailed performance can be seen in our paper.

Demos

Use a pre-trained PWNet for video stabilization

Download a pre-trained network
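Once a checkpoint is available, inference amounts to predicting a per-pixel flow for each frame and resampling. The sketch below uses a tiny stand-in module in place of the real (unreleased) PWNet class; the checkpoint path and class name are assumptions.

```python
import torch
import torch.nn.functional as F

class PWNet(torch.nn.Module):
    """Hypothetical stand-in: predicts a per-pixel flow field from a frame."""
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 2, kernel_size=3, padding=1)

    def forward(self, x):
        # (N, 2, H, W) flow, rearranged to grid_sample's (N, H, W, 2) layout.
        return self.conv(x).permute(0, 2, 3, 1)

model = PWNet().eval()
# With a real checkpoint (path is an assumption):
# model.load_state_dict(torch.load("weights/pwnet_final.pth"))

frame = torch.rand(1, 3, 64, 64)  # one video frame
with torch.no_grad():
    flow = model(frame)
    theta = torch.tensor([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]])
    grid = F.affine_grid(theta, frame.shape, align_corners=True)
    stabilized = F.grid_sample(frame, grid + flow, align_corners=True)
```

Running this over every frame of a clip yields the stabilized video.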

Authors

References

[1] M. Wang, G.-Y. Yang, J.-K. Lin, S.-H. Zhang, A. Shamir, S.-P. Lu, and S.-M. Hu, “Deep online video stabilization with multi-grid warping transformation learning,” IEEE Transactions on Image Processing, vol. 28, no. 5, pp. 2283–2292, 2019.