ANTICIPATR

This repository contains the codebase for Anticipation Transformer (Anticipatr), proposed in the ECCV'22 paper "Rethinking Learning Approaches for Long-Term Action Anticipation".


Model

<div align='center'> <img src='assets/model.png' width='512px'> </div>

Getting started

We propose a two-stage training approach for the task of long-term action anticipation.

Our training setup relies on the directory structure used in this repository.
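As a rough orientation, the snippet below is a minimal, illustrative sketch of what a two-stage training pipeline of this kind can look like; it is not the actual Anticipatr training code. The module names (`SegmentEncoder`, `VideoAnticipator`), dimensions, loaders, and hyperparameters are hypothetical placeholders, assuming pre-extracted segment features and multi-label future-action targets.

```python
# Illustrative sketch of a generic two-stage training setup (not the repo's code).
import torch
import torch.nn as nn

class SegmentEncoder(nn.Module):
    """Stage 1: placeholder segment-level encoder predicting future action labels."""
    def __init__(self, feat_dim=512, num_actions=48):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(feat_dim, 256), nn.ReLU())
        self.head = nn.Linear(256, num_actions)

    def forward(self, segment_feats):
        # segment_feats: (batch, feat_dim) pooled features of one observed segment
        return self.head(self.backbone(segment_feats))  # multi-label logits

class VideoAnticipator(nn.Module):
    """Stage 2: placeholder video-level model reusing the frozen stage-1 encoder."""
    def __init__(self, segment_encoder, num_actions=48):
        super().__init__()
        self.segment_encoder = segment_encoder
        self.temporal = nn.GRU(256, 256, batch_first=True)
        self.head = nn.Linear(256, num_actions)

    def forward(self, video_feats):
        # video_feats: (batch, num_segments, feat_dim) features of the observed video
        b, t, d = video_feats.shape
        z = self.segment_encoder.backbone(video_feats.reshape(b * t, d)).reshape(b, t, -1)
        out, _ = self.temporal(z)
        return self.head(out[:, -1])  # logits for actions in the anticipated future

def train_two_stage(stage1_loader, stage2_loader, num_actions=48, epochs=1):
    criterion = nn.BCEWithLogitsLoss()

    # Stage 1: train the segment encoder on individual segments.
    seg_enc = SegmentEncoder(num_actions=num_actions)
    opt1 = torch.optim.Adam(seg_enc.parameters(), lr=1e-4)
    for _ in range(epochs):
        for feats, targets in stage1_loader:
            opt1.zero_grad()
            criterion(seg_enc(feats), targets).backward()
            opt1.step()

    # Stage 2: freeze the segment encoder, train the video-level model.
    for p in seg_enc.parameters():
        p.requires_grad_(False)
    model = VideoAnticipator(seg_enc, num_actions=num_actions)
    opt2 = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-4)
    for _ in range(epochs):
        for feats, targets in stage2_loader:
            opt2.zero_grad()
            criterion(model(feats), targets).backward()
            opt2.step()
    return model
```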


Citation

@inproceedings{nawhal2022anticipatr,
  title={Rethinking Learning Approaches for Long-Term Action Anticipation},
  author={Nawhal, Megha and Jyothi, Akash Abdu and Mori, Greg},
  booktitle={Proceedings of the European Conference on Computer Vision},
  year={2022}
}

Contact

For further questions, please email Megha Nawhal.