[🌟ECCV2024 Poster🌟] EventBind: Learning a Unified Representation to Bind Them All for Event-based Open-world Understanding (Project Page)

This repository contains the official PyTorch implementation of the paper "EventBind: Learning a Unified Representation to Bind Them All for Event-based Open-world Understanding".

<div align="center"> <img src="image/EventBind.png" width="1300px"> </div>

Citation

If you find this paper useful, please consider starring 🌟 this repo and citing 📑 our paper:

@article{zhou2023clip,
  title={E-CLIP: Towards Label-efficient Event-based Open-world Understanding by CLIP},
  author={Zhou, Jiazhou and Zheng, Xu and Lyu, Yuanhuiyi and Wang, Lin},
  journal={arXiv e-prints},
  pages={arXiv--2308},
  year={2023}
}

Quick Start

  1. Refer to install.md for step-by-step guidance on how to install the packages.
  2. Download the ViT-B-32, ViT-B-16, and ViT-L-14 CLIP pretrained backbones in this repository (see the loading sketch after this list).
  3. Download the dataset and its corresponding model checkpoints from the Dataset section and Checkpoints section below, respectively. Note that the train-val splits for the N-Caltech101/Caltech101 and N-MNIST/MNIST datasets are provided in the Dataloader folder to ensure fair future comparisons, while for N-ImageNet/ImageNet we follow the dataset's official train-val split.
  4. Change the settings in dataset_name.yaml in the Configs folder; the fields to edit are marked with TODO notes.
  5. Finally, train and evaluate EventBind with the following command:
python ./EventBind/train_dp_dataset_name.py
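
To sanity-check step 2, the sketch below loads one of the downloaded CLIP backbones and runs a dummy image through it. This is a minimal sketch assuming the official openai/CLIP package, not the repository's training code; the model names follow CLIP's model zoo.

```python
# Minimal sketch (assumption: the official openai/CLIP package is installed,
# e.g. pip install git+https://github.com/openai/CLIP.git). This is not the
# repository's training code; it only verifies a backbone loads and runs.
import torch
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"

# Swap "ViT-B/32" for "ViT-B/16" or "ViT-L/14" to match your config.
model, preprocess = clip.load("ViT-B/32", device=device)

# Encode a dummy batch to confirm the backbone runs end to end.
dummy = torch.zeros(1, 3, 224, 224, device=device)
with torch.no_grad():
    features = model.encode_image(dummy)

print(features.shape)  # torch.Size([1, 512]) for ViT-B/32
```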

Checkpoints

<div align=center>

| Datasets | Access to Model Checkpoints |
|:---:|:---:|
| N-Caltech101 | ViT-B-32, ViT-B-16, ViT-L-14 |
| N-MNIST | ViT-B-32, ViT-B-16, ViT-L-14 |
| N-ImageNet | ViT-B-32, ViT-B-16, ViT-L-14 |

</div>

Dataset

Please refer to the .txt files in the Dataloader folder for the dataset structure.
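
As an illustration, a split file of that kind can be read as follows. This is a hypothetical sketch: the file name and the one-sample-path-per-line layout are assumptions, so check the actual .txt files in the Dataloader folder for the real format.

```python
# Hypothetical sketch: parse a train/val split file, assuming one relative
# sample path per non-empty line. Verify against the actual .txt files in
# the Dataloader folder before relying on this layout.
from pathlib import Path

def read_split(split_file: str) -> list[str]:
    """Return the non-empty, stripped lines of a split file."""
    lines = Path(split_file).read_text().splitlines()
    return [line.strip() for line in lines if line.strip()]

# Example usage (path is hypothetical):
# samples = read_split("Dataloader/N-Caltech101_train_split.txt")
```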

<div align=center>

| Event Datasets | Access to Download | Corresponding Image Datasets | Access to Download |
|:---:|:---:|:---:|:---:|
| N-Caltech101 | Download | Caltech101 | Download |
| N-ImageNet | Download | ImageNet | Download |
| N-MNIST | Download | MNIST | Download |

</div>

Dependencies

Please refer to install.md for step-by-step guidance on how to install the packages.


Acknowledgement

We thank the authors of CLIP and CoOp for open-sourcing their wonderful works.


License

This repository is released under the MIT License.


Contact

If you have any questions about this project, please open an issue in this repository or send an email to jiazhou.garland@gmail.com.