
AVCE_FER

Emotion-aware Multi-view Contrastive Learning for Facial Emotion Recognition (ECCV 2022)<br>

<a href="https://releases.ubuntu.com/16.04/"><img alt="Ubuntu" src="https://img.shields.io/badge/Ubuntu-16.04-green"></a> <a href="https://www.python.org/downloads/release/python-380/"><img alt="Python" src="https://img.shields.io/badge/Python-v3.8-blue"></a> <a href="https://pytorch.org/get-started/locally/"><img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-ee4c2c?logo=pytorch&logoColor=white"></a>

Daeha Kim, Byung Cheol Song

CVIP Lab, Inha University

Real-time demo with pre-trained weights

<p align="center"> <img src="https://github.com/kdhht2334/AVCE_FER/blob/main/AVCE_demo/AVCE_demo_vid.gif" height="320"/> </p>

Requirements

To install all dependencies, run:

```bash
pip install -r requirements.txt
```
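
The `requirements.txt` in the repository root is authoritative. As a rough orientation only, the packages this README implies (PyTorch itself, Wandb for online tracking, OpenCV for the real-time demo) would appear there; an illustrative, non-authoritative excerpt:

```
torch
torchvision
opencv-python
wandb
```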

News

[22.07.10]: Added source code and demo.

[22.07.07]: Opened the official PyTorch version of AVCE_FER.

Datasets

  1. Download three public benchmarks for training and evaluation (we cannot upload the datasets ourselves due to copyright issues). For more details, visit each dataset's website.

  2. Follow the preprocessing rules for each dataset by referring to the official PyTorch custom dataset tutorial; a minimal sketch is shown below.
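
To make the tutorial reference concrete, here is a minimal custom `Dataset` sketch for continuous valence/arousal labels. The CSV layout (image path, valence, arousal per row) is an assumption for illustration; adapt it to each benchmark's actual annotation format.

```python
# Minimal sketch following the official PyTorch custom-dataset tutorial.
# The CSV layout below is hypothetical; adjust it per benchmark.
import csv
from PIL import Image
import torch
from torch.utils.data import Dataset

class VADataset(Dataset):
    """Loads (face image, [valence, arousal]) pairs from a CSV index."""

    def __init__(self, csv_path, transform=None):
        with open(csv_path) as f:
            # Each row: image_path, valence, arousal  (assumed layout)
            self.samples = [(row[0], float(row[1]), float(row[2]))
                            for row in csv.reader(f)]
        self.transform = transform

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, valence, arousal = self.samples[idx]
        image = Image.open(path).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        target = torch.tensor([valence, arousal], dtype=torch.float32)
        return image, target
```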

Pretrained weights
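
Once you have a checkpoint, restoring it follows the standard PyTorch pattern. A minimal sketch, assuming an AlexNet backbone (one of the two supported models), a two-dimensional valence/arousal output head, and a hypothetical checkpoint file name:

```python
# Hedged sketch of restoring a checkpoint; "avce_alexnet.pth" is a
# hypothetical file name, not the actual release artifact.
import torch
from torchvision.models import alexnet

model = alexnet(num_classes=2)  # assumption: 2 outputs (valence, arousal)
state_dict = torch.load("avce_alexnet.pth", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()  # inference mode
```

The same pattern applies to the `resnet18` backbone via `torchvision.models.resnet18(num_classes=2)`.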

Run

  1. Go to /src.

  2. Train AVCE with the command below.

  3. (Or) execute run.sh.

```bash
CUDA_VISIBLE_DEVICES=0 python main.py --freq 250 --model alexnet --online_tracker 1 --data_path <data_path> --save_path <save_path>
```
| Argument | Description |
| --- | --- |
| `freq` | Parameter saving frequency. |
| `model` | CNN model for the backbone. Choose from `alexnet` and `resnet18`. |
| `online_tracker` | Wandb on/off. |
| `data_path` | Path to load the facial dataset. |
| `save_path` | Path to save weights. |
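
For orientation, the flags above correspond to a plain `argparse` setup. The following is a hedged sketch of how main.py might declare them, not the actual parser in the repository:

```python
# Hedged sketch of an argparse setup matching the flags above; the real
# main.py may define them differently.
import argparse

parser = argparse.ArgumentParser(description="Train AVCE")
parser.add_argument("--freq", type=int, default=250,
                    help="Parameter saving frequency.")
parser.add_argument("--model", choices=["alexnet", "resnet18"],
                    default="alexnet", help="CNN backbone.")
parser.add_argument("--online_tracker", type=int, choices=[0, 1],
                    default=1, help="Wandb on (1) / off (0).")
parser.add_argument("--data_path", required=True,
                    help="Path to the facial dataset.")
parser.add_argument("--save_path", required=True,
                    help="Directory in which to save weights.")
args = parser.parse_args()
```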

Real-time demo

  1. Go to /AVCE_demo.

  2. Run main.py; a hedged sketch of the kind of webcam loop this involves is shown below.
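
For readers who want the shape of such a demo, here is a self-contained sketch of a webcam inference loop with OpenCV. The backbone choice, input resolution, and checkpoint name are assumptions carried over from the sketches above, not the actual AVCE_demo code:

```python
# Hedged sketch of a real-time webcam demo; the backbone, 224x224 input,
# and "avce_alexnet.pth" checkpoint name are assumptions for illustration.
import cv2
import torch
from torchvision import transforms
from torchvision.models import alexnet

model = alexnet(num_classes=2)  # assumption: (valence, arousal) head
model.load_state_dict(torch.load("avce_alexnet.pth", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),  # assumption: 224x224 network input
    transforms.ToTensor(),
])

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        valence, arousal = model(preprocess(rgb).unsqueeze(0))[0].tolist()
    cv2.putText(frame, f"V: {valence:+.2f}  A: {arousal:+.2f}",
                (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("AVCE demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```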

Citation

```bibtex
@inproceedings{kim2022emotion,
  title={Emotion-aware Multi-view Contrastive Learning for Facial Emotion Recognition},
  author={Kim, Daeha and Song, Byung Cheol},
  booktitle={European Conference on Computer Vision},
  pages={178--195},
  year={2022},
  organization={Springer}
}
```

Contact

If you have any questions, feel free to contact me at kdhht5022@gmail.com.