Local Context-Aware Active Domain Adaptation

PyTorch implementation of LADA.

Local Context-Aware Active Domain Adaptation
Tao Sun, Cheng Lu, and Haibin Ling
ICCV 2023

Abstract

Active Domain Adaptation (ADA) queries the labels of a small number of selected target samples to help adapt a model from a source domain to a target domain. The local context of the queried data is important, especially when the domain gap is large. However, it has not been fully explored by existing ADA works.

In this paper, we propose a Local context-aware ADA framework, named LADA, to address this issue. To select informative target samples, we devise a novel criterion based on the local inconsistency of model predictions. Since the labeling budget is usually small, fine-tuning the model on only the queried data can be inefficient. We therefore progressively augment the labeled target data with confident neighbors in a class-balanced manner.

Experiments validate that the proposed criterion chooses more informative target samples than existing active selection strategies. Furthermore, our full method surpasses recent state-of-the-art ADA methods on various benchmarks.

<p align="center"> <img src="fig/framework.png" width="900"> <br> </p>

Usage

Prerequisites

We experimented with python==3.8, pytorch==1.8.0, and cudatoolkit==11.1.

To start, download the Office-31, Office-Home, and VisDA datasets, and set up the paths in the ./data folder.
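
The expected layout under ./data is not documented here; one plausible arrangement is sketched below (the folder names are assumptions), so check the dataset loading code and the configuration files for the exact paths this repo expects.

```
data/
├── office31/      # amazon, dslr, webcam
├── office_home/   # Art, Clipart, Product, Real_World
└── visda2017/     # train, validation
```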

Supported methods

| Active Criteria | Paper | Implementation |
| --- | --- | --- |
| Random | - | random |
| Entropy | - | entropy |
| Margin | - | margin |
| LeastConfidence | - | leastConfidence |
| CoreSet | ICLR 2018 | coreset |
| AADA | WACV 2020 | AADA |
| BADGE | ICLR 2020 | BADGE |
| CLUE | ICCV 2021 | CLUE |
| MHP | CVPR 2023 | MHP |
| LAS (ours) | ICCV 2023 | LAS |
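
For reference, the non-learned uncertainty criteria in the table above reduce to simple functions of the softmax output. The sketch below is not the repository's code, just the standard definitions (the helper name `uncertainty_scores` is made up):

```python
import torch

def uncertainty_scores(probs, criterion="entropy"):
    """Standard uncertainty scores (illustrative, not the repo's implementation).
    probs: (N, C) softmax outputs for unlabeled target samples; higher score = query first."""
    if criterion == "entropy":
        # high predictive entropy = uncertain prediction
        return -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
    if criterion == "margin":
        # small gap between the top-2 classes = uncertain; negate so higher = more uncertain
        top2 = probs.topk(2, dim=1).values
        return -(top2[:, 0] - top2[:, 1])
    if criterion == "leastConfidence":
        # low maximum probability = uncertain
        return 1.0 - probs.max(dim=1).values
    raise ValueError(f"unknown criterion: {criterion}")
```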
| Domain Adaptation | Paper | Implementation |
| --- | --- | --- |
| Fine-tuning (joint label set) | - | ft_joint |
| Fine-tuning | - | ft |
| DANN | JMLR 2016 | dann |
| MME | ICCV 2019 | mme |
| MCC | ECCV 2020 | MCC |
| CDAC | CVPR 2021 | CDAC |
| RAA (ours) | ICCV 2023 | RAA |
| LAA (ours) | ICCV 2023 | LAA |
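
The abstract describes RAA/LAA as progressively augmenting the labeled target data with confident neighbors in a class-balanced manner. The sketch below is only a rough, assumption-laden illustration of that idea, not the actual RAA/LAA implementation: the pseudo-labels, the confidence threshold, the per-class quota, and the cosine-similarity neighborhood are all placeholders.

```python
import torch
import torch.nn.functional as F

def augment_anchor_set(feats, probs, anchor_idx, per_class=5, conf_thresh=0.9):
    """Rough sketch of class-balanced anchor-set augmentation (illustrative only):
    add confident unlabeled neighbors of the queried anchors, the same number
    per pseudo-class, to the set used for adaptation."""
    feats = F.normalize(feats, dim=1)
    conf, pseudo = probs.max(dim=1)                  # confidence and pseudo-label
    sim_to_anchors = feats @ feats[anchor_idx].t()   # (N, |anchors|) similarities
    near_anchor = sim_to_anchors.max(dim=1).values   # closeness to the nearest anchor
    augmented = []
    for c in pseudo.unique():
        mask = (pseudo == c) & (conf > conf_thresh)  # confident samples of pseudo-class c
        cand = torch.nonzero(mask, as_tuple=True)[0]
        if cand.numel() == 0:
            continue
        order = near_anchor[cand].argsort(descending=True)
        augmented.append(cand[order[:per_class]])    # closest confident neighbors
    # in practice one would also exclude already-labeled samples
    return torch.cat(augmented) if augmented else anchor_idx.new_empty(0)
```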

Training

To obtain results of the baseline active selection criteria on Office-Home with a 5% labeling budget, run:

for ADA_DA in 'ft' 'mme'; do
  for ADA_AL in 'random' 'entropy' 'margin' 'coreset' 'leastConfidence' 'BADGE' 'AADA' 'CLUE' 'MHP'; do
    python main.py --cfg configs/officehome.yaml --gpu 0 --log log/oh/baseline  ADA.AL $ADA_AL  ADA.DA $ADA_DA
  done
done

To reproduce the results of LADA on Office-Home with a 5% labeling budget, run:

# LAS + fine-tuning with CE loss
python main.py --cfg configs/officehome.yaml --gpu 0 --log log/oh/LADA  ADA.AL LAS  ADA.DA ft
# LAS + MME model adaptation
python main.py --cfg configs/officehome.yaml --gpu 0 --log log/oh/LADA  ADA.AL LAS  ADA.DA mme
# LAS + Random Anchor set Augmentation (RAA)
python main.py --cfg configs/officehome.yaml --gpu 0 --log log/oh/LADA  ADA.AL LAS  ADA.DA RAA
# LAS + Local context-aware Anchor set Augmentation (LAA)
python main.py --cfg configs/officehome.yaml --gpu 0 --log log/oh/LADA  ADA.AL LAS  ADA.DA LAA 

More commands can be found in run.sh.

Acknowledgements

The pipeline and the implementations of the baseline methods are adapted from CLUE and deep-active-learning. We adopt the configuration file style of EADA.

Citation

If you find our paper and code useful for your research, please consider citing

@inproceedings{sun2022local,
    author    = {Sun, Tao and Lu, Cheng and Ling, Haibin},
    title     = {Local Context-Aware Active Domain Adaptation},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    year      = {2023}
}