CoNMix for Source-free Single and Multi-target Domain Adaptation

Official PyTorch implementation for CoNMix (website)

Accepted at IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)-2023, Waikoloa, Hawaii

Vikash Kumar, Rohit Lal, Himanshu Patil, Anirban Chakraborty


Abstract

This work introduces the novel task of Source-free Multi-target Domain Adaptation and proposes an adaptation framework, CoNMix, comprising Consistency with Nuclear-Norm Maximization and MixUp knowledge distillation as a solution to this problem.


Getting Started

Download Data

Manually download the dataset Office-Home

<details> <summary>Click to see full directory tree</summary>
   data
    ├── office-home
        ├── Art
        ├── Art.txt
        ├── Clipart
        ├── Clipart.txt
        ├── Product
        ├── Product.txt
        ├── Real_World
        └── RealWorld.txt

</details>
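Each `<Domain>.txt` file is an image list for that domain. The sketch below generates one, assuming the common "image-path class-index" line format used by SFDA data loaders and alphabetical class-folder ordering (both assumptions; verify against this repo's loader). A tiny fake tree is created first so the snippet is self-contained.

```shell
# Create a minimal demo layout: data/<Domain>/<class>/<image>
mkdir -p demo/office-home/Art/Alarm_Clock demo/office-home/Art/Backpack
touch demo/office-home/Art/Alarm_Clock/a1.jpg demo/office-home/Art/Backpack/b1.jpg

cd demo/office-home
for domain in Art; do
  : > "${domain}.txt"          # truncate/create the list file
  idx=0
  for cls in "$domain"/*/; do  # class folders, alphabetical glob order
    for img in "$cls"*; do
      echo "$img $idx" >> "${domain}.txt"
    done
    idx=$((idx + 1))
  done
done
cat Art.txt
```

For the real dataset, loop over all four domains (`Art Clipart Product Real_World`) and run from `data/office-home`.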

Download Models

Get models in this link: R50-ViT-B_16, ViT-B_16, ViT-L_16...

wget https://storage.googleapis.com/vit_models/imagenet21k/R50+ViT-B_16.npz
mkdir -p model/vit_checkpoint/imagenet21k
mv R50+ViT-B_16.npz model/vit_checkpoint/imagenet21k/R50+ViT-B_16.npz
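Before launching training, it is worth confirming the checkpoint landed at the path the commands above set up. The snippet below touches a placeholder file so it is self-contained; in practice the real `.npz` from `wget` sits at that path.

```shell
# Sanity check (sketch): is the ViT checkpoint where training expects it?
mkdir -p model/vit_checkpoint/imagenet21k
touch model/vit_checkpoint/imagenet21k/R50+ViT-B_16.npz  # placeholder only
ckpt=model/vit_checkpoint/imagenet21k/R50+ViT-B_16.npz
[ -f "$ckpt" ] && echo "checkpoint ready: $ckpt" || echo "missing: $ckpt"
```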

Prerequisites:

Training

Install the dependencies and run scripts.

Stage 1: Source only Training

We first train a source-only model using a supervised loss.

sh scripts/source_train.sh

Stage 2: STDA training

Source-free Single-Target Domain Adaptation (STDA) training.

sh scripts/STDA.sh

Stage 3: KD-MTDA training

Source-free Multi-Target Domain Adaptation via knowledge distillation. This stage must be run only after STDA training is complete.

sh scripts/MTDA.sh

Testing

For testing any model, use the test_model_acc.py script. It contains two functions.

Changes to be done in code

Code Reference

Citation

@inproceedings{kumar2023conmix,
  title={CoNMix for Source-free Single and Multi-target Domain Adaptation},
  author={Kumar, Vikash and Lal, Rohit and Patil, Himanshu and Chakraborty, Anirban},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  pages={4178--4188},
  year={2023}
}