
# Domain Adaptation via Prompt Learning

## Overview of Domain Adaptation via Prompt Learning

<div align="center">Overview of Domain Adaptation via Prompt Learning (DAPL)</div>

[[Paper]](https://arxiv.org/abs/2202.06687) [[Code]](https://github.com/LeapLabTHU/DAPrompt)

Domain adaptation via prompt learning (DAPL), which builds on CLIP and CoOp, offers a simple solution to the domain adaptation problem. The prompt consists of three parts: a domain-agnostic context, a domain-specific context, and the class label (token).

Our method tailors the powerful CLIP model for UDA by designing trainable domain-agnostic, domain-specific, and class prompts. By learning the representation of the prompt, our method in effect learns a conditional probability distribution that accounts for the distribution shift, so it learns a different decision boundary for each domain. Moreover, we show that this allows disentangling the semantic and domain representations with contrastive learning.
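To make the prompt design concrete, here is a minimal sketch of how such a three-part prompt could be assembled and scored against an image feature in a CLIP-like manner. All sizes and helper names are our own illustrative assumptions, not the repository's actual API, and a simple mean-pooling stands in for CLIP's text encoder.

```python
import torch
import torch.nn.functional as F

# Illustrative sizes only; these are assumptions, not DAPL's actual configuration.
EMB_DIM, N_AGNOSTIC, N_SPECIFIC = 512, 8, 4
n_domains, n_classes = 2, 12

# Trainable context vectors, in the spirit of CoOp-style prompt learning.
ctx_agnostic = torch.nn.Parameter(torch.randn(N_AGNOSTIC, EMB_DIM))             # shared by all domains
ctx_specific = torch.nn.Parameter(torch.randn(n_domains, N_SPECIFIC, EMB_DIM))  # one set per domain
class_tokens = torch.randn(n_classes, EMB_DIM)  # frozen class-name embeddings

def build_prompt(d: int, k: int) -> torch.Tensor:
    """[domain-agnostic ctx | domain-specific ctx of domain d | class-k token]."""
    return torch.cat([ctx_agnostic, ctx_specific[d], class_tokens[k:k + 1]], dim=0)

# Stand-in for CLIP's text encoder: pool each prompt into a single feature.
prompt_feats = torch.stack([build_prompt(d, k).mean(dim=0)
                            for d in range(n_domains)
                            for k in range(n_classes)])

image_feat = torch.randn(1, EMB_DIM)                       # stand-in image feature
logits = F.cosine_similarity(image_feat, prompt_feats)     # one score per (domain, class) prompt
probs = logits.softmax(dim=0).view(n_domains, n_classes)   # per-domain class distributions
```

Because every (domain, class) pair has its own prompt, the softmax over prompt scores yields a separate class distribution per domain, which is what gives the method its per-domain decision boundaries.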

## Performance of DAPrompt

We evaluate our method on three benchmarks: VisDA-2017, mini-DomainNet and Office-Home.

<div align="center">VisDA-2017</div>
ResNet101ViT-B/16
CLIP84.488.7
CLIP+FT74.580.5
DAPrompt86.989.8
<div align="center">mini-DomainNet</div>
ModelMethodc-pc-rc-sp-cp-rp-sr-cr-pr-ss-cs-ps-rAvg
CLIP67.984.862.969.184.862.969.267.962.969.167.984.871.2
ResNet50CLIP-FT58.973.552.560.279.552.962.965.755.761.951.872.962.4
DAPrompt72.487.665.972.787.665.673.272.466.273.872.987.874.8
CLIP80.390.577.882.790.577.882.780.377.882.780.390.582.8
Vit-B/16CLIP-FT72.284.371.379.584.367.580.376.575.980.270.083.777.1
DAPrompt83.392.481.186.492.181.086.783.380.886.883.591.985.8
<div align="center">Office-Home</div>
ModelMethodA-CA-PA-RC-AC-PC-RP-AP-CP-RR-AR-CR-PAvg
CLIP51.681.982.671.981.982.671.951.682.671.951.681.972
ResNet50CLIP-FT44.967.474.561.469.170.461.045.477.670.549.081.464.4
DAPrompt54.184.384.874.483.78574.554.684.875.254.783.874.5
CLIP67.889.089.882.989.089.882.967.889.882.967.889.082.4
Vit-B/16CLIP-FT64.379.484.477.683.983.873.566.886.379.067.088.777.9
DAPrompt70.791.090.985.291.091.085.170.790.985.370.491.484.4

## How to Install

Our code is built on the source code of CoOp, so you need to install its dependencies first.

```bash
# install CLIP
pip install ftfy regex tqdm
pip install git+https://github.com/openai/CLIP.git

# clone DAPL
git clone https://github.com/LeapLabTHU/DAPrompt.git

# install Dassl
git clone https://github.com/KaiyangZhou/Dassl.pytorch.git
cd Dassl.pytorch
pip install -r requirements.txt
pip install .
cd ..

# CLIP weights are downloaded automatically on first use; alternatively,
# download them manually and modify the path to the weights in the clip file
```

You may also follow the installation guides of CLIP and Dassl.
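As a quick sanity check that CLIP installed correctly (and to trigger the automatic weight download mentioned above), you can load a model in Python. This uses only the public API of the clip package; ViT-B/16 is chosen simply to match the backbone used in the tables above.

```python
import torch
import clip

# Listing the available models verifies that the package imports correctly.
print(clip.available_models())  # should include 'ViT-B/16'

# Loading a model downloads its weights to the local cache on first use.
device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/16", device=device)
print("Loaded CLIP ViT-B/16 on", device)
```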

## Download Datasets

VisDA-2017 is a dataset from the VisDA 2017 challenge. It contains two domains: a synthetic domain with 152,397 images and a real domain with 55,388 images.

Home page for VisDA dataset

Download the VisDA-classification dataset

Office-Home is a dataset comprising 4 domains and 65 categories, with a total of 15,588 images.

Home page for Office-Home dataset

Download the Office-Home dataset

## How to Run

We provide the running scripts in scripts/. Make sure you set the dataset path in DATA and run the commands under DAPL/scripts/.

### Training

The command is in the file DAPL/scripts/main.sh, which takes six input arguments:

Below we provide an example of how to run DAPL on VisDA-2017. The file DAPL/scripts/main.sh defines the path to the dataset on line 6; set it to the actual path of your dataset. To train DAPL on the VisDA-2017 dataset, run the following command from DAPL/scripts:

```bash
bash main.sh visda17 ep25-32-csc 1.0 0.5 1.0 t0
```

### Load a Pre-trained Model

We have uploaded a pretrained weight. You can load it and evaluate it on the target domain with the following command:

```bash
bash eval.sh visda17 ep25-32-csc 1.0 0.5 1.0 t0
```

## How to Develop a New Algorithm

The structure of our lib:

If you want to define a new method NewDA, you may develop the project according to the following guide and then train it with a command like (a minimal trainer skeleton is sketched after the command):

```bash
bash main.sh visda17 ep25 1.0 0.5 1.0 t0
```
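Since DAPL builds on Dassl, a new method is typically registered as a Dassl trainer. Below is a hedged skeleton under that assumption: the registry decorator, base class, and batch keys come from Dassl's public API, while the model and loss are pure placeholders, not DAPL's actual objective.

```python
import torch.nn.functional as F

from dassl.engine import TRAINER_REGISTRY, TrainerXU


@TRAINER_REGISTRY.register()
class NewDA(TrainerXU):
    """Skeleton UDA trainer; TrainerXU supplies labeled source (x) and unlabeled target (u) batches."""

    def forward_backward(self, batch_x, batch_u):
        input_x, label_x, input_u = self.parse_batch_train(batch_x, batch_u)
        output = self.model(input_x)             # self.model is built by the inherited build_model()
        loss = F.cross_entropy(output, label_x)  # placeholder: source-only loss, not DAPL's objective
        self.model_backward_and_update(loss)
        return {"loss": loss.item()}

    def parse_batch_train(self, batch_x, batch_u):
        input_x = batch_x["img"].to(self.device)
        label_x = batch_x["label"].to(self.device)
        input_u = batch_u["img"].to(self.device)
        return input_x, label_x, input_u
```

Once registered, the trainer is selected by name through the config (cfg.TRAINER.NAME = "NewDA"), which is how Dassl's build_trainer dispatches to it.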

If you want to add a new dataset NewData, you may follow:
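As a rough illustration, a Dassl-compatible dataset is registered in a similar way. In the sketch below, the registry decorator, Datum, and DatasetBase come from Dassl's public API; the folder name and the domain/class/image directory layout are our assumptions for illustration only.

```python
import os

from dassl.data.datasets import DATASET_REGISTRY, Datum, DatasetBase


@DATASET_REGISTRY.register()
class NewData(DatasetBase):
    """Skeleton dataset; the <root>/new_data/<domain>/<class>/<image> layout is an assumption."""

    dataset_dir = "new_data"  # hypothetical folder under cfg.DATASET.ROOT

    def __init__(self, cfg):
        root = os.path.abspath(os.path.expanduser(cfg.DATASET.ROOT))
        self.dataset_dir = os.path.join(root, self.dataset_dir)

        # UDA trainers expect labeled source data (train_x) and unlabeled
        # target data (train_u); the target split doubles as the test set here.
        train_x = self._read_data(cfg.DATASET.SOURCE_DOMAINS)
        train_u = self._read_data(cfg.DATASET.TARGET_DOMAINS)
        test = self._read_data(cfg.DATASET.TARGET_DOMAINS)

        super().__init__(train_x=train_x, train_u=train_u, test=test)

    def _read_data(self, domains):
        items = []
        for domain_idx, domain in enumerate(domains):
            domain_dir = os.path.join(self.dataset_dir, domain)
            for label, cname in enumerate(sorted(os.listdir(domain_dir))):
                cdir = os.path.join(domain_dir, cname)
                for fname in sorted(os.listdir(cdir)):
                    items.append(Datum(impath=os.path.join(cdir, fname),
                                       label=label, domain=domain_idx,
                                       classname=cname))
        return items
```

The new dataset is then selected via cfg.DATASET.NAME = "NewData", mirroring how trainers are dispatched by name.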

For new algorithm development, there are some dependencies that are useful to read:

## Acknowledgement

Thanks to the following projects:

## How to Contact Us

You can send an e-mail to gecj20 at mails.tsinghua.edu.cn if you have questions.

## How to Cite DAPL

If you use this code in your research, please kindly cite the following paper:

```bibtex
@article{ge2022domain,
  title={Domain Adaptation via Prompt Learning},
  author={Ge, Chunjiang and Huang, Rui and Xie, Mixue and Lai, Zihang and Song, Shiji and Li, Shuang and Huang, Gao},
  journal={arXiv preprint arXiv:2202.06687},
  year={2022}
}
```