# DARE-GRAM
[CVPR 2023] Code for our paper DARE-GRAM : Unsupervised Domain Adaptation Regression by Aligning Inversed Gram Matrices
<img src="./images/main_methode.jpg" alt="DARE-GRAM overview" width="200%" height="150%">

## Prerequisites
- Python 3
- NumPy
- PyTorch == 1.12.1 (with CUDA and CuDNN (cu113))
- torchvision == 0.13.1
- PIL
- scikit-learn
To reproduce our results, please create and activate the following conda environment:
```bash
# It may take several minutes for conda to solve the environment
conda update conda
conda env create -f environment.yml
conda activate daregram
```
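Once the environment is active, a quick sanity check (not part of the repository) can confirm that the expected PyTorch build is installed:

```python
import torch
import torchvision

# Expect torch 1.12.1 built against CUDA 11.3 and torchvision 0.13.1.
print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("CUDA build:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
```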
## Train and test the DARE-GRAM model
The program can be run with the default parameters using the following commands:
```bash
# Train on dSprites
cd code/dsprites
sh dare_gram.sh
```

```bash
# Train on MPI3D
cd code/mpi3d
sh dare_gram.sh
```
The code was tested on an RTX 3090.
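The training scripts above implement the DARE-GRAM objective described in the paper. Purely as orientation, the sketch below illustrates the underlying idea of aligning the (pseudo-)inverse Gram matrices of source and target features; the function name, the ridge term, and the cosine-based penalty are illustrative assumptions, not the official loss (see `code/dsprites` and `code/mpi3d` for the actual implementation).

```python
import torch
import torch.nn.functional as F

def dare_gram_alignment_sketch(fs, ft, eps=1e-3):
    """Illustrative alignment of (pseudo-)inverse Gram matrices.

    fs, ft: (batch, d) source / target features from a shared backbone.
    This is a simplified sketch, not the loss used in the paper.
    """
    b, d = fs.shape
    eye = torch.eye(d, device=fs.device)
    # Batch Gram matrices (d x d), with a small ridge term so the
    # (pseudo-)inverse is well conditioned.
    gram_s = fs.t() @ fs / b + eps * eye
    gram_t = ft.t() @ ft / b + eps * eye
    inv_s = torch.linalg.pinv(gram_s)
    inv_t = torch.linalg.pinv(gram_t)
    # Penalize the angular discrepancy between the inverse Gram matrices
    # (per-column cosine similarity) -- one simple alignment choice.
    cos = F.cosine_similarity(inv_s, inv_t, dim=0)
    return (1.0 - cos).mean()
```

In such a setup, the alignment term would typically be added to the supervised regression loss on the labeled source data, e.g. `loss = mse_source + trade_off * dare_gram_alignment_sketch(f_src, f_tgt)`, where `trade_off` is a hypothetical weighting hyperparameter.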
## Citation
Please cite our work if you find it useful.
```bibtex
@inproceedings{nejjar2023domain,
  title={DARE-GRAM : Unsupervised Domain Adaptation Regression by Aligning Inversed Gram Matrices},
  author={Nejjar, Ismail and Wang, Qin and Fink, Olga},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2023}
}
```
## Acknowledgement
- RSD is used as our codebase and as our domain adaptation baseline (official code).
## Data links
dSprites can be downloaded from here.
MPI3D can be downloaded from here.
The files should be unzipped and placed in separate folders (a folder template is provided).
## Contact
For questions regarding the code, please contact ismail.nejjar@epfl.ch.