RSKP

Weakly Supervised Temporal Action Localization via Representative Snippet Knowledge Propagation (CVPR 2022)<br> Linjiang Huang (CUHK), Liang Wang (CASIA), Hongsheng Li (CUHK)


Overview

The experimental results on THUMOS'14 are shown below.

| Method \ mAP(%) | @0.1 | @0.2 | @0.3 | @0.4 | @0.5 | @0.6 | @0.7 | AVG |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| UntrimmedNet | 44.4 | 37.7 | 28.2 | 21.1 | 13.7 | - | - | - |
| STPN | 52.0 | 44.7 | 35.5 | 25.8 | 16.9 | 9.9 | 4.3 | 27.0 |
| W-TALC | 55.2 | 49.6 | 40.1 | 31.1 | 22.8 | - | 7.6 | - |
| AutoLoc | - | - | 35.8 | 29.0 | 21.2 | 13.4 | 5.8 | - |
| CleanNet | - | - | 37.0 | 30.9 | 23.9 | 13.9 | 7.1 | - |
| MAAN | 59.8 | 50.8 | 41.1 | 30.6 | 20.3 | 12.0 | 6.9 | 31.6 |
| CMCS | 57.4 | 50.8 | 41.2 | 32.1 | 23.1 | 15.0 | 7.0 | 32.4 |
| BM | 60.4 | 56.0 | 46.6 | 37.5 | 26.8 | 17.6 | 9.0 | 36.3 |
| RPN | 62.3 | 57.0 | 48.2 | 37.2 | 27.9 | 16.7 | 8.1 | 36.8 |
| DGAM | 60.0 | 54.2 | 46.8 | 38.2 | 28.8 | 19.8 | 11.4 | 37.0 |
| TSCN | 63.4 | 57.6 | 47.8 | 37.7 | 28.7 | 19.4 | 10.2 | 37.8 |
| EM-MIL | 59.1 | 52.7 | 45.5 | 36.8 | 30.5 | 22.7 | 16.4 | 37.7 |
| BaS-Net | 58.2 | 52.3 | 44.6 | 36.0 | 27.0 | 18.6 | 10.4 | 35.3 |
| A2CL-PT | 61.2 | 56.1 | 48.1 | 39.0 | 30.1 | 19.2 | 10.6 | 37.8 |
| ACM-BANet | 64.6 | 57.7 | 48.9 | 40.9 | 32.3 | 21.9 | 13.5 | 39.9 |
| HAM-Net | 65.4 | 59.0 | 50.3 | 41.1 | 31.0 | 20.7 | 11.1 | 39.8 |
| ACSNet | - | - | 51.4 | 42.7 | 32.4 | 22.0 | 11.7 | - |
| WUM | 67.5 | 61.2 | 52.3 | 43.4 | 33.7 | 22.9 | 12.1 | 41.9 |
| AUMN | 66.2 | 61.9 | 54.9 | 44.4 | 33.3 | 20.5 | 9.0 | 41.5 |
| CoLA | 66.2 | 59.5 | 51.5 | 41.9 | 32.2 | 22.0 | 13.1 | 40.9 |
| ASL | 67.0 | - | 51.8 | - | 31.1 | - | 11.4 | - |
| TS-PCA | 67.6 | 61.1 | 53.4 | 43.4 | 34.3 | 24.7 | 13.7 | 42.6 |
| UGCT | 69.2 | 62.9 | 55.5 | 46.5 | 35.9 | 23.8 | 11.4 | 43.6 |
| CO2-Net | 70.1 | 63.6 | 54.5 | 45.7 | 38.3 | 26.4 | 13.4 | 44.6 |
| D2-Net | 65.7 | 60.2 | 52.3 | 43.4 | 36.0 | - | - | - |
| FAC-Net | 67.6 | 62.1 | 52.6 | 44.3 | 33.4 | 22.5 | 12.7 | 42.2 |
| Ours | 71.3 | 65.3 | 55.8 | 47.6 | 38.2 | 25.4 | 12.5 | 45.1 |

Prerequisites

Recommended Environment

Note: Our code works with different PyTorch and CUDA versions. For newer versions of PyTorch, you need to change one line of our code, as described in this issue.

Data Preparation

  1. Prepare THUMOS'14 dataset.

    • We recommend using features and annotations provided by this repo.
  2. Place the features and annotations inside a dataset/Thumos14reduced/ folder.
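The steps above can be sketched as follows. This is an illustrative layout only: the actual filenames come from the features and annotations you download, so adjust the paths to match your copies.

```shell
# Create the folder the code expects.
mkdir -p dataset/Thumos14reduced

# Move the downloaded features and annotations into place.
# (Example filenames below are hypothetical; use the names from your download.)
# mv ~/Downloads/Thumos14reduced-I3D-JOINTFeatures.npy dataset/Thumos14reduced/
# mv ~/Downloads/Thumos14reduced-Annotations dataset/Thumos14reduced/

# Verify the layout before training.
ls dataset/Thumos14reduced
```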

Usage

Training

You can train the model by running the provided script:

$ python main.py --run-type 0 --model-id 1

Models are saved in ./ckpt/dataset_name/model_id/

Evaluation

The trained model can be found here. (The saved model's result differs slightly from the one reported in our paper.)

Please put it into ./ckpt/dataset_name/model_id/.

$ python main.py --pretrained --run-type 1 --model-id 1 --load-epoch xxx

Please refer to the log in the same folder as the saved models to set the load epoch of the best model. Make sure the model-id you set matches the one used during training.

References

We referenced the repositories below for our code.

Citation

@InProceedings{rskp,
  title={Weakly Supervised Temporal Action Localization via Representative Snippet Knowledge Propagation},
  author={Huang, Linjiang and Wang, Liang and Li, Hongsheng},
  booktitle={IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}

Contact

If you have any questions or comments, please contact the first author of the paper, Linjiang Huang (ljhuang524@gmail.com).