SF-PGL

This repository is the official PyTorch implementation of our papers:

Source-Free Progressive Graph Learning for Open-Set Domain Adaptation
Yadan Luo, Zijian Wang, Zhuoxiao Chen, Zi Huang, Mahsa Baktashmotlagh
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)

Progressive Graph Learning for Open-Set Domain Adaptation
Yadan Luo^, Zijian Wang^, Zi Huang, Mahsa Baktashmotlagh
International Conference on Machine Learning (ICML) 2020
[Paper] [Code]


Framework

To handle the more realistic yet challenging source-free setting, we propose the novel SF-PGL framework, which leverages a balanced pseudo-labeling regime to enable uncertainty-aware progressive learning without relying on any distribution matching or adversarial learning. As an extension of PGL, it broadens open-set domain adaptation from the unsupervised case to the source-free and semi-supervised settings, and from image classification to action recognition, where more complex data interactions and a larger domain gap must be addressed. We further study a hitherto untouched aspect of OSDA models: model calibration. Experimental results show that SF-PGL alleviates the class imbalance introduced by pseudo-labeled sets, thereby avoiding both overconfidence and under-confidence in the OSDA model.

<p align="center"> <img src="docs/sf-pgl.png" width="70%"> </p>

Contents

<!-- * [Video Demo](#video-demo) -->

Requirements

Dataset Preparation

Training

The general command for training is:

```shell
python3 train.py
```

Change the arguments to run different experiments.

Remember to set `dataset_root` to point to your own dataset location.
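For example, a run on your own data might look like the following sketch. The `--dataset_root` flag corresponds to the `dataset_root` argument mentioned above; the exact flag names depend on the script's argument parser, so check `python3 train.py --help` for the actual options.

```shell
# Hypothetical invocation — verify flag names against train.py's argparse options.
python3 train.py --dataset_root /path/to/your/dataset
```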

The training loss and validation accuracy are automatically saved in `./logs/` and can be visualized with TensorBoard. Model weights are saved in `./checkpoints/`.
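To view the saved training curves, point TensorBoard at the log directory and open the printed URL (http://localhost:6006 by default) in a browser:

```shell
# Serve the event files that training writes into ./logs/
tensorboard --logdir ./logs
```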