Codebase Readme

This is the codebase for the paper "Revisiting Unsupervised Domain Adaptation Models: a Smoothness Perspective" (ACCV 2022).

Environment

conda env create -f leco.yaml
conda activate leco

Prepare the datasets

Office-31 can be found here.
Office-Home can be found here.
Visda-C can be found here.
DomainNet can be found here.

Training guides

Visda-C

For MCC:

python da_visda.py --dset visda --lr 0.001 --net resnet101 --gpu_id 0 --batch_size 36 --base MCC --method Blank --interval 2 --s 0 --t 1

For MCC + LECO:

python da_visda.py --dset visda --lr 0.001 --net resnet101 --gpu_id 0 --batch_size 36 --base MCC --method LECO --interval 2 --s 0 --t 1 --warm_up 3000 --lamda 3

We run each setting with seeds 2020, 2021, and 2022, showing stable improvements over MCC. For training logs, refer to TV.
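The three seeded runs can be scripted as a small launcher. This is a hypothetical sketch: the seed flag name (`--seed`) is an assumption, as the commands above do not show how the seed is passed to da_visda.py.

```python
# Build the three seeded MCC+LECO commands; --seed is an assumed flag name.
import shlex

base = ("python da_visda.py --dset visda --lr 0.001 --net resnet101 "
        "--gpu_id 0 --batch_size 36 --base MCC --method LECO --interval 2 "
        "--s 0 --t 1 --warm_up 3000 --lamda 3")
cmds = [shlex.split(base) + ["--seed", str(s)] for s in (2020, 2021, 2022)]
for cmd in cmds:
    print(" ".join(cmd))  # swap print for subprocess.run(cmd) to launch
```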

| Methods | plane | bcycl | bus | car | horse | knife | mcycl | person | plant | sktbrd | train | truck | Per-class |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| MCC (2020) | 94.3 | 80.35 | 75.93 | 64.03 | 92.45 | 97.16 | 85.23 | 83.12 | 89.23 | 86.01 | 82.11 | 53.26 | 81.93 |
| MCC (2021) | 93.69 | 84.06 | 76.35 | 65.71 | 91.39 | 94.94 | 86.04 | 77.62 | 92.44 | 89.57 | 81.52 | 54.29 | 82.30 |
| MCC (2022) | 93.25 | 81.18 | 73.73 | 57.23 | 90.94 | 71.08 | 83.09 | 77.05 | 82.63 | 86.94 | 81.89 | 55.73 | 77.90 |
| MCC+LeCo (2020) | 97.12 | 85.96 | 83.86 | 89.66 | 96.55 | 97.45 | 89.06 | 84.05 | 95.91 | 90.79 | 85.08 | 43.82 | 86.61 |
| MCC+LeCo (2021) | 95.72 | 86.33 | 86.46 | 91.55 | 96.18 | 96.82 | 92.53 | 74.18 | 96.07 | 92.85 | 84.07 | 38.09 | 85.90 |
| MCC+LeCo (2022) | 96.49 | 87.02 | 79.17 | 90.46 | 95.86 | 96.43 | 91.24 | 82.55 | 94.55 | 92.42 | 88.36 | 40.57 | 86.26 |
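As a quick sanity check on the table above, the Per-class column is the unweighted mean of the 12 per-class accuracies. For example, for the 2020 rows:

```python
# Recompute the Per-class column from the 12 class accuracies of a row.
mcc_2020 = [94.3, 80.35, 75.93, 64.03, 92.45, 97.16,
            85.23, 83.12, 89.23, 86.01, 82.11, 53.26]
leco_2020 = [97.12, 85.96, 83.86, 89.66, 96.55, 97.45,
             89.06, 84.05, 95.91, 90.79, 85.08, 43.82]

def per_class(accs):
    return round(sum(accs) / len(accs), 2)

print(per_class(mcc_2020))   # 81.93, matching the MCC (2020) row
print(per_class(leco_2020))  # 86.61, matching the MCC+LeCo (2020) row
```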

For CDAN:

python da_visda.py --dset visda --lr 0.01 --net resnet101 --gpu_id 0 --batch_size 36 --base CDAN --method Blank --interval 2 --s 0 --t 1 --warm_up 3000 --lamda 3 --lr_decay2 0.1

For CDAN + LECO:

python da_visda.py --dset visda --lr 0.01 --net resnet101 --gpu_id 0 --batch_size 36 --base CDAN --method LECO --interval 2 --s 0 --t 1 --warm_up 3000 --lamda 0.5 --lr_decay2 0.1

For BNM:

python da_visda.py --dset visda --lr 0.001 --net resnet101 --gpu_id 0 --batch_size 36 --base BNM --method Blank --interval 2 --s 0 --t 1

For BNM + LeCo:

python da_visda.py --dset visda --lr 0.001 --net resnet101 --gpu_id 0 --batch_size 36 --base BNM --method LECO --interval 2 --s 0 --t 1 --warm_up 3000 --lamda 2

Office-Home

For MCC:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base MCC --method Blank --interval 2

For MCC + LECO:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base MCC --method LECO --interval 2 --warm_up 3000 --lamda 2

For CDAN:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base CDAN --method Blank --interval 2 --lr_decay2 0.1

For CDAN + LECO:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base CDAN --method LECO --interval 2 --warm_up 3000 --lamda 2 --lr_decay2 0.1

For BNM:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base BNM --method Blank --interval 2

For BNM + LECO:

python da_home.py --dset office-home --lr 0.01 --net resnet50 --gpu_id 0 --batch_size 36 --base BNM --method LECO --interval 2 --lamda 3

DomainNet

For MCC + LECO

python da_domainNet.py --dset com-dn --lr 0.01 --net resnet101 --gpu_id 0 --batch_size 36 --base MCC --method LECO --interval 5 --warm_up 3000 --lamda 2

Office-31

This code file is borrowed from BNM, and you need to specify the source and target domains as follows.
For baseline MCC with method LECO:

python da_office.py --gpu_id 0 --base MCC --method LECO --num_iterations 8004  --dset office --s dslr --t amazon --test_interval 2000  --lambda_method 3

Visualization

For intra-class and inter-class variance visualization, see cal_cluster_intra.py and cal_cluster_inter.py.
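A minimal sketch of the two quantities those scripts visualize, assuming you have a feature matrix and class labels (the function name and signature here are illustrative, not the scripts' actual API):

```python
import numpy as np

def class_variances(feats, labels):
    """Intra-class: mean squared distance of features to their class centroid.
    Inter-class: mean squared distance of class centroids to the global mean."""
    classes = np.unique(labels)
    centroids = np.stack([feats[labels == c].mean(axis=0) for c in classes])
    intra = float(np.mean([
        ((feats[labels == c] - centroids[i]) ** 2).sum(axis=1).mean()
        for i, c in enumerate(classes)
    ]))
    inter = float(((centroids - feats.mean(axis=0)) ** 2).sum(axis=1).mean())
    return intra, inter
```

Lower intra-class variance with higher inter-class variance indicates tighter, better-separated clusters in the feature space.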

Validation

To choose the best hyper-parameter, refer to dev_loss.py.
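The selection step itself is just an argmin over candidate values. A sketch, where the loss numbers are placeholders (not real results) standing in for estimates produced by dev_loss.py:

```python
# Pick the lamda whose validation-proxy loss (e.g. from dev_loss.py) is lowest.
# The values below are illustrative placeholders, not measured results.
val_losses = {0.5: 0.42, 1: 0.39, 2: 0.35, 3: 0.37}  # lamda -> estimated loss
best_lamda = min(val_losses, key=val_losses.get)
print(best_lamda)  # 2
```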

BibTeX

@inproceedings{wang2022revisiting,
  title={Revisiting Unsupervised Domain Adaptation Models: a Smoothness Perspective},
  author={Wang, Xiaodong and Zhuo, Junbao and Zhang, Mengru and Wang, Shuhui and Fang, Yuejian},
  booktitle={Proceedings of the Asian Conference on Computer Vision},
  pages={1504--1521},
  year={2022}
}