[🌟ECCV2024 Poster🌟] EventBind: Learning a Unified Representation to Bind Them All for Event-based Open-world Understanding (Project Page)
This repository contains the official PyTorch implementation of the paper "EventBind: Learning a Unified Representation to Bind Them All for Event-based Open-world Understanding".
<div align="center"> <img src="image/EventBind.png" width="1300px"> </div>Citation
If you find this paper useful, please consider starring 🌟 this repo and citing 📑 our paper:
@article{zhou2023clip,
title={E-CLIP: Towards Label-efficient Event-based Open-world Understanding by CLIP},
author={Zhou, Jiazhou and Zheng, Xu and Lyu, Yuanhuiyi and Wang, Lin},
journal={arXiv e-prints},
pages={arXiv--2308},
year={2023}
}
Quick Start
- Refer to install.md for step-by-step guidance on how to install the packages.
- Download the ViT-B-32, ViT-B-16, or ViT-L-14 pretrained CLIP backbone from this repository.
- Download the dataset and its corresponding model checkpoints from the Dataset and Checkpoints sections below, respectively. Note that the train-val splits for the N-Caltech101/Caltech101 and N-MNIST/MNIST datasets are provided in the Dataloader folder to ensure fair future comparisons; for N-ImageNet/ImageNet we follow the official train-val split.
- Change the settings in dataset_name.yaml in the Configs folder; the fields to edit are marked with TODO notes.
- Finally, train and evaluate EventBind with the following command, replacing dataset_name with the dataset you want to use:
python ./EventBind/train_dp_dataset_name.py
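For intuition, the sketch below illustrates the CLIP-style zero-shot scoring that EventBind builds on: event counts are rendered into a frame and compared against text prompts in CLIP's shared embedding space. The event-to-frame conversion, class names, and random events here are illustrative assumptions for a minimal sketch, not the repository's actual pipeline.

```python
# Minimal sketch of CLIP-style zero-shot scoring on an event frame.
# The event rendering and class names below are illustrative assumptions.
import numpy as np
import torch
import clip  # pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Render a toy event stream (x, y coordinates) into a 2-D count histogram.
H, W = 224, 224
events = np.random.randint(0, 224, size=(5000, 2))  # placeholder events
frame = np.zeros((H, W), dtype=np.float32)
np.add.at(frame, (events[:, 1], events[:, 0]), 1.0)
frame = (255 * frame / frame.max()).astype(np.uint8)
image = preprocess(Image.fromarray(frame).convert("RGB")).unsqueeze(0).to(device)

# Score the frame against text prompts for a few hypothetical class names.
classes = ["airplane", "butterfly", "chair"]
text = clip.tokenize([f"a photo of a {c}" for c in classes]).to(device)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print({c: float(p) for c, p in zip(classes, probs[0])})
```

EventBind goes beyond this plain recipe with a dedicated event encoder and a unified event-image-text representation; see the paper for details.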
Checkpoints
<div align=center>

Datasets | Access to Model Checkpoints
---|---
N-Caltech101 | ViT-B-32, ViT-B-16, ViT-L-14
N-MNIST | ViT-B-32, ViT-B-16, ViT-L-14
N-ImageNet | ViT-B-32, ViT-B-16, ViT-L-14

</div>
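As a quick sanity check after downloading, a checkpoint can be inspected with plain PyTorch. This is a hedged sketch: the filename is hypothetical, and the repository's checkpoints may nest the weights under a different key.

```python
import torch

# Hypothetical filename; use the actual path of the checkpoint you downloaded.
ckpt = torch.load("EventBind_NCaltech101_ViT-B-32.pth", map_location="cpu")
# Some training scripts save {"model": state_dict, ...}; fall back to the raw dict.
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
print(f"{len(state_dict)} tensors; first keys: {list(state_dict)[:5]}")
```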
Dataset
Please refer to the .txt files in the Dataloader folder for the dataset structure.
<div align=center>

Event Datasets | Access to Download | Corresponding Image Datasets | Access to Download
---|---|---|---
N-Caltech101 | Download | Caltech101 | Download
N-ImageNet | Download | ImageNet | Download
N-MNIST | Download | MNIST | Download

</div>
Dependencies
Please refer to install.md for step-by-step guidance on how to install the packages.
Acknowledgement
We thank the authors of CLIP and CoOp for open-sourcing their wonderful works.
License
This repository is released under the MIT License.
Contact
If you have any questions about this project, please open an issue in this repository or send an email to jiazhou.garland@gmail.com.