Continuously Indexed Domain Adaptation (CIDA)

This is the authors' official PyTorch implementation of CIDA. This repo contains code for the experiments in the ICML 2020 paper 'Continuously Indexed Domain Adaptation'.

Outline for This README

- Beyond Domain Adaptation: Brief Introduction for CIDA
- Sample Results
- Method Overview
- IPython Notebooks and Environment
- Quantitative Results
- Theory
- Also Check Out Relevant Work
- Reference

Beyond Domain Adaptation: Brief Introduction for CIDA

Essentially, CIDA asks whether and how to go beyond the current (categorical) domain adaptation regime, and proposes the first approach that adapts across continuously indexed domains. For example, instead of adapting from domain A to domain B, we would like to simultaneously adapt across infinitely many domains in a manifold. This allows us to go beyond categorical domain adaptation and perform both domain interpolation and domain extrapolation. See the following toy example.

<p align="center"> <img src="fig/blog-circle.png" alt="" data-canonical-src="fig/blog-circle.png" width="95%"/> </p>

Naturally, categorical domain adaptation works on a finite number of domains, while CIDA works on infinitely many domains. See the following comparison.

<p align="center"> <img src="fig/blog-CategoricalDA-vs-CIDA.png" alt="" data-canonical-src="fig/blog-CategoricalDA-vs-CIDA.png" width="95%"/> </p>

For a more visual introduction, feel free to take a look at this video.

Sample Results

If we use domains [1, 6] as source domains and the rest as target domains, below are some sample results from previous domain adaptation methods and CIDA, where CIDA successfully learns how the decision boundary evolves with the domain index.

<p align="center"> <img src="fig/blog-circle-ADDA-vs-CIDA-with-GT.png" alt="" data-canonical-src="fig/blog-circle-ADDA-vs-CIDA-with-GT.png" width="95%"/> </p>

Method Overview

We provide a simple yet effective learning framework with theoretical guarantees (see the Theory section at the end of this README). Below is a quick comparison between previous domain adaptation methods and CIDA (differences marked in red).

<p align="center"> <img src="fig/blog-method-DA-vs-CIDA.png" alt="" data-canonical-src="fig/blog-method-DA-vs-CIDA.png" width="95%"/> </p>

IPython Notebooks and Environment

Below are some IPython notebooks for the experiments. We strongly recommend starting from the simplest case, i.e., Experiments for Toy Datasets (Quarter Circle), to get familiar with the data and settings.

IPython Notebooks

Besides using the IPython notebooks, you can also directly run the following command for the Rotating MNIST experiments inside the folder `rotatingMNIST`:

```bash
bash run_all_exp.sh
```

Environment

Quantitative Results

Rotating MNIST

<p align="center"> <img src="fig/blog-table-mnist.png" alt="" data-canonical-src="fig/blog-table-mnist.png" width="95%"/> </p>

Intra-Dataset Results on Real-World Medical Datasets

In the intra-dataset setting, we consider both domain extrapolation and domain interpolation (see the figure below).

<p align="center"> <img src="fig/blog-domain-inter-extra.png" alt="" data-canonical-src="fig/blog-domain-inter-extra.png" width="95%"/> </p> <p align="center"> <img src="fig/blog-table-intra.png" alt="" data-canonical-src="fig/blog-table-intra.png" width="95%"/> </p>

Cross-Dataset Results on Real-World Medical Datasets

<p align="center"> <img src="fig/blog-table-cross.png" alt="" data-canonical-src="fig/blog-table-cross.png" width="85%"/> </p>

Multi-Dimensional CIDA on Real-World Medical Datasets

<p align="center"> <img src="fig/blog-table-multi.png" alt="" data-canonical-src="fig/blog-table-multi.png" width="55%"/> </p>

Theory

Denoting the domain index as u and the encoding as z, we have the following guarantees (check the paper for the full theorems and their conditions).
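Roughly, and as an informal paraphrase rather than the paper's exact statements: at the global optimum of the adversarial game, the encoding z carries no information about the expectation of the domain index u, and the probabilistic variant (PCIDA) additionally matches its variance:

```latex
% Informal paraphrase only; see the paper for the exact theorems and conditions.
\text{CIDA:}\quad  \mathbb{E}[u \mid z] = \mathbb{E}[u] \;\;\text{for all } z,
\qquad
\text{PCIDA:}\quad \mathbb{E}[u \mid z] = \mathbb{E}[u],\;\;
                   \operatorname{Var}[u \mid z] = \operatorname{Var}[u] \;\;\text{for all } z.
```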

Also Check Out Relevant Work

Graph-Relational Domain Adaptation<br> Zihao Xu, Hao He, Guang-He Lee, Yuyang Wang, Hao Wang<br> Tenth International Conference on Learning Representations (ICLR), 2022<br> [Paper] [Code] [Talk] [Slides]

Reference

Continuously Indexed Domain Adaptation

```bibtex
@inproceedings{DBLP:conf/icml/WangHK20,
  author    = {Hao Wang and
               Hao He and
               Dina Katabi},
  title     = {Continuously Indexed Domain Adaptation},
  booktitle = {ICML},
  year      = {2020}
}
```