SSL models are Strong UDA learners

Introduction

This is the UDA and "UDA + SSL" part of the official code for the paper "Semi-supervised Models are Strong Unsupervised Domain Adaptation Learners". It is based on pure PyTorch and demonstrates the high effectiveness of SSL methods on UDA tasks. You can easily develop new algorithms or readily apply existing ones. Code for the SSL methods is given here.

The currently supported algorithms include:

Unsupervised learning for unsupervised domain adaptation

UDA + SSL

| Method          | Office31 | OfficeHome | VisDA-2017<br>(Inductive) | DomainNet |
|-----------------|----------|------------|---------------------------|-----------|
| MDD             | 88.7     | 68.5       | 73.7±0.2                  | 29.7      |
| MDD+Consistency | 89.0     | 71.2       | 76.4±3.9                  | 32.4      |
| MCC             | 89.8     | 70.8       | 78.3±0.4                  | 28.2      |
| MCC+Consistency | 89.9     | 72.2       | 83.1±0.8                  | 28.8      |

From this table we can see that adding the consistency term improves performance on all four benchmarks.
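
The consistency term follows the standard SSL recipe: predictions on a strongly augmented target image are pushed toward confident pseudo-labels computed from a weakly augmented view of the same image. Below is a minimal PyTorch sketch of such a FixMatch-style consistency loss; the function name, the 0.95 confidence threshold, and the masking scheme are illustrative assumptions, not the exact formulation used in mdd_consistency / mcc_consistency.

```python
import torch
import torch.nn.functional as F

def consistency_loss(logits_weak: torch.Tensor,
                     logits_strong: torch.Tensor,
                     threshold: float = 0.95) -> torch.Tensor:
    """FixMatch-style consistency on unlabeled target images (illustrative sketch).

    logits_weak:   classifier outputs for weakly augmented target images
    logits_strong: classifier outputs for strongly augmented views of the same images
    threshold:     confidence threshold for keeping a pseudo-label (assumed value)
    """
    with torch.no_grad():
        probs = F.softmax(logits_weak, dim=1)        # pseudo-label distribution
        max_probs, pseudo_labels = probs.max(dim=1)  # hard pseudo-labels
        mask = (max_probs >= threshold).float()      # keep only confident samples
    # Cross-entropy between strong-view predictions and pseudo-labels,
    # masked so that low-confidence target samples contribute nothing.
    loss = F.cross_entropy(logits_strong, pseudo_labels, reduction="none")
    return (loss * mask).mean()
```

In MDD+Consistency and MCC+Consistency, a term of this kind would simply be added, with some weight, to the original classification and transfer losses.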

Installation

This implementation is based on a variant of the Transfer-Learn codebase, in which we slightly change the data transform and add two new methods: mdd_consistency and mcc_consistency. Please refer to the Transfer-Learn codebase for installation.
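
As a rough illustration of the changed data transform, consistency training typically loads each target image with a weak and a strong augmentation in parallel. The sketch below is an assumption based on common consistency pipelines (RandAugment for the strong view) with hypothetical helper names; it is not the exact transform shipped in this repository.

```python
import torchvision.transforms as T

# Assumed ImageNet statistics for the ResNet backbones used in the experiments.
normalize = T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

# Weak view: the usual crop-and-flip, used to produce pseudo-labels.
weak_transform = T.Compose([
    T.RandomResizedCrop(224),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    normalize,
])

# Strong view: extra RandAugment perturbation, used for the consistency target.
strong_transform = T.Compose([
    T.RandomResizedCrop(224),
    T.RandomHorizontalFlip(),
    T.RandAugment(num_ops=2, magnitude=9),
    T.ToTensor(),
    normalize,
])

class TwoViewTransform:
    """Return a (weak, strong) pair of views of the same image (hypothetical helper)."""
    def __call__(self, img):
        return weak_transform(img), strong_transform(img)
```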

Usage

The commands for our experiments on each dataset are collected in dataset_name.sh (for example, domainnet.sh). These scripts can be run once Transfer-Learn is installed.

Running Experiments

cd Transfer-Learning-Library/examples/domain_adaptation/classification
sh office31.sh
sh officehome.sh
sh visda.sh
sh domainnet.sh

We do not provide checkpoints, since training each model is quick and there are too many tasks.

Contributing

Any pull requests or issues are welcome. Contributions of other SSL methods on UDA tasks are especially encouraged.

Citation

If you use this toolbox or benchmark in your research, please cite this project.

@inproceedings{SSL2UDA,
  author = {xxx},
  title = {Semi-supervised Models are Strong Unsupervised Domain Adaptation Learners},
  year = {2021},
  publisher = {xxx},
  journal = {xxx},
}

Acknowledgment

We would like to thank the Transfer Learning Library for their excellent contribution.

License

MIT License, the same as the Transfer Learning Library.

Below is the original content of the Transfer-Learn library's README.


<img src="https://github.com/thuml/Transfer-Learning-Library/blob/dev/TransLearn.png"/>

Introduction

Transfer-Learn is an open-source and well-documented library for Transfer Learning. It is based on pure PyTorch with high performance and a friendly API. Our code is pythonic, and the design is consistent with torchvision. You can easily develop new algorithms, or readily apply existing algorithms.
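
For readers unfamiliar with this workflow, a domain adaptation method in this style typically wraps a torchvision backbone with a classification head and combines a supervised source loss with a method-specific transfer loss on target data. The skeleton below is plain PyTorch under assumed names (build_model, train_step, transfer_criterion); it is only a sketch, not the library's actual API.

```python
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

def build_model(num_classes: int = 31) -> nn.Module:
    """ImageNet-pretrained ResNet-50 with a fresh head (e.g. 31 classes for Office-31)."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

def train_step(model, transfer_criterion, optimizer, x_s, y_s, x_t):
    """One adaptation step: cross-entropy on labeled source images plus a
    DA transfer loss (DANN/MDD/MCC-style, passed in as `transfer_criterion`)."""
    logits_s, logits_t = model(x_s), model(x_t)
    loss = F.cross_entropy(logits_s, y_s) + transfer_criterion(logits_s, logits_t)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```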

The currently supported algorithms include:

Domain Adaptation for Classification
Partial Domain Adaptation
Open-set Domain Adaptation
Domain Adaptation for Segmentation
Domain Adaptation for Keypoint Detection
Finetune for Classification

We are planning to add

The performance of these algorithms was fairly evaluated in this benchmark.

Installation

For flexible use and modification, please git clone the library.

Documentation

You can find the tutorial and API documentation on the website: Documentation (please open in Firefox or Safari). Note that this link is only for temporary use. You can also build the docs yourself by following the instructions at http://170.106.108.162/get_started/faq.html.

Also, we have examples in the directory examples. A typical usage is

# Train a DANN on the Office-31 Amazon -> Webcam task using ResNet-50.
# Assume you have put the datasets under the path `data/office31`,
# or that you are willing to let the script download them automatically to this path.
python dann.py data/office31 -d Office31 -s A -t W -a resnet50 --epochs 20

In the directory examples, you can find all the necessary running scripts to reproduce the benchmarks with specified hyper-parameters.

Contributing

We appreciate all contributions. If you are planning to contribute back bug-fixes, please do so without any further discussion. If you plan to contribute new features, utility functions or extensions, please first open an issue and discuss the feature with us.

Disclaimer on Datasets

This is a utility library that downloads and prepares public datasets. We do not host or distribute these datasets, vouch for their quality or fairness, or claim that you have licenses to use the dataset. It is your responsibility to determine whether you have permission to use the dataset under the dataset's license.

If you're a dataset owner and wish to update any part of it (description, citation, etc.), or do not want your dataset to be included in this library, please get in touch through a GitHub issue. Thanks for your contribution to the ML community!

Contact

If you have any problems with our code or any suggestions, including about future features, feel free to contact

or describe it in Issues.

For Q&A in Chinese, you can ask questions here before sending an email: the Transfer Learning Library Q&A area (迁移学习算法库答疑专区).

Citation

If you use this toolbox or benchmark in your research, please cite this project.

@misc{dalib,
  author = {Junguang Jiang and Bo Fu and Mingsheng Long},
  title = {Transfer-Learning-library},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/thuml/Transfer-Learning-Library}},
}

Acknowledgment

We would like to thank School of Software, Tsinghua University and The National Engineering Laboratory for Big Data Software for providing such an excellent ML research platform.