
Deep Transfer Learning in PyTorch

MIT License

This is a PyTorch library for deep transfer learning. The code is organized around two settings: Single-source Unsupervised Domain Adaptation (SUDA) and Multi-source Unsupervised Domain Adaptation (MUDA). While there are many deep SUDA methods, there are few deep MUDA methods, even though MUDA may be the more promising direction for domain adaptation.

I have implemented several deep transfer methods; their results are reported in the tables below.
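Several of the methods compared below are built around simple, self-contained alignment losses. As one illustration, here is a minimal sketch of a Deep CORAL-style loss, which aligns the second-order statistics (covariances) of source and target features. This is an illustrative sketch, not the exact implementation used in this repository:

```python
import torch

def coral_loss(source, target):
    """Deep CORAL-style loss: squared Frobenius distance between the
    feature covariances of a source batch and a target batch.
    source, target: (batch_size, feature_dim) activation matrices."""
    d = source.size(1)

    def cov(x):
        # Center each feature dimension, then form the sample covariance.
        xm = x - x.mean(dim=0, keepdim=True)
        return (xm.t() @ xm) / (x.size(0) - 1)

    # Normalize by 4*d^2 as in the Deep CORAL formulation.
    return ((cov(source) - cov(target)) ** 2).sum() / (4 * d * d)
```

The loss is zero when both batches have identical covariance structure, so adding it to the classification loss encourages domain-invariant features.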

Results on Office31 (SUDA)

| Method | A - W | D - W | W - D | A - D | D - A | W - A | Average |
| --- | --- | --- | --- | --- | --- | --- | --- |
| ResNet | 68.4±0.5 | 96.7±0.5 | 99.3±0.1 | 68.9±0.2 | 62.5±0.3 | 60.7±0.3 | 76.1 |
| DDC | 75.8±0.2 | 95.0±0.2 | 98.2±0.1 | 77.5±0.3 | 67.4±0.4 | 64.0±0.5 | 79.7 |
| DDC* | 78.3±0.4 | 97.1±0.1 | 100.0±0.0 | 81.7±0.9 | 65.2±0.6 | 65.1±0.4 | 81.2 |
| DAN | 83.8±0.4 | 96.8±0.2 | 99.5±0.1 | 78.4±0.2 | 66.7±0.3 | 62.7±0.2 | 81.3 |
| DAN* | 82.6±0.7 | 97.7±0.1 | 100.0±0.0 | 83.1±0.9 | 66.8±0.3 | 66.6±0.4 | 82.8 |
| DCORAL* | 79.0±0.5 | 98.0±0.2 | 100.0±0.0 | 82.7±0.1 | 65.3±0.3 | 64.5±0.3 | 81.6 |
| Revgrad | 82.0±0.4 | 96.9±0.2 | 99.1±0.1 | 79.7±0.4 | 68.2±0.4 | 67.4±0.5 | 82.2 |
| Revgrad* | 82.6±0.9 | 97.8±0.2 | 100.0±0.0 | 83.3±0.9 | 66.8±0.1 | 66.1±0.5 | 82.8 |
| MRAN | 91.4±0.1 | 96.9±0.3 | 99.8±0.2 | 86.4±0.6 | 68.3±0.5 | 70.9±0.6 | 85.6 |
| DSAN | 93.6±0.2 | 98.4±0.1 | 100.0±0.0 | 90.2±0.7 | 73.5±0.5 | 74.8±0.4 | 88.4 |

Note that the results without '*' come from the original papers, while the results with '*' were produced by running the code in this repository.
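Revgrad in the table above refers to adversarial adaptation through a gradient reversal layer, which acts as the identity in the forward pass but flips (and scales) the gradient in the backward pass. A minimal PyTorch sketch of such a layer follows; it is illustrative, and the repository's own implementation may differ in detail:

```python
import torch

class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity on the forward pass, multiplies the
    incoming gradient by -lambd on the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # One gradient per forward input; lambd gets no gradient.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)
```

Placing this layer between the feature extractor and a domain classifier makes the extractor maximize the domain loss the classifier minimizes, yielding domain-confusing features.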

Results on Office31 (MUDA)

| Standards | Method | A,W - D | A,D - W | D,W - A | Average |
| --- | --- | --- | --- | --- | --- |
| Single Best | ResNet | 99.3 | 96.7 | 62.5 | 86.2 |
| | DAN | 99.5 | 96.8 | 66.7 | 87.7 |
| | DCORAL | 99.7 | 98.0 | 65.3 | 87.7 |
| | RevGrad | 99.1 | 96.9 | 68.2 | 88.1 |
| Source Combine | DAN | 99.6 | 97.8 | 67.6 | 88.3 |
| | DCORAL | 99.3 | 98.0 | 67.1 | 88.1 |
| | RevGrad | 99.7 | 98.1 | 67.6 | 88.5 |
| Multi-Source | MFSAN | 99.5 | 98.5 | 72.7 | 90.2 |

Results on OfficeHome (MUDA)

| Standards | Method | C,P,R - A | A,P,R - C | A,C,R - P | A,C,P - R | Average |
| --- | --- | --- | --- | --- | --- | --- |
| Single Best | ResNet | 65.3 | 49.6 | 79.7 | 75.4 | 67.5 |
| | DAN | 64.1 | 50.8 | 78.2 | 75.0 | 67.0 |
| | DCORAL | 68.2 | 56.5 | 80.3 | 75.9 | 70.2 |
| | RevGrad | 67.9 | 55.9 | 80.4 | 75.8 | 70.0 |
| Source Combine | DAN | 68.5 | 59.4 | 79.0 | 82.5 | 72.4 |
| | DCORAL | 68.1 | 58.6 | 79.5 | 82.7 | 72.2 |
| | RevGrad | 68.4 | 59.1 | 79.5 | 82.7 | 72.4 |
| Multi-Source | MFSAN | 72.1 | 62.0 | 80.3 | 81.8 | 74.1 |

Note that (1) Source combine: all source domains are combined into a single source, reducing the problem to a traditional single-source vs. target setting. (2) Single best: among the multiple source domains, we report the best single-source transfer result. (3) Multi-source: the results of MUDA methods.

Note

If your accuracy is 100%, the problem is likely the dataset folder layout. The data provider requires the following structure:

```
dataset/
    amazon/
    webcam/
    dslr/
```

Contact

If you have any problems with this library, please open an Issue or contact us by email at:

Reference

If you use this repository, please cite the following papers:

```bibtex
@inproceedings{zhu2019aligning,
  title={Aligning domain-specific distribution and classifier for cross-domain classification from multiple sources},
  author={Zhu, Yongchun and Zhuang, Fuzhen and Wang, Deqing},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={33},
  pages={5989--5996},
  year={2019}
}
@article{zhu2020deep,
  title={Deep subdomain adaptation network for image classification},
  author={Zhu, Yongchun and Zhuang, Fuzhen and Wang, Jindong and Ke, Guolin and Chen, Jingwu and Bian, Jiang and Xiong, Hui and He, Qing},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  volume={32},
  number={4},
  pages={1713--1722},
  year={2020},
  publisher={IEEE}
}
@article{zhu2019multi,
  title={Multi-representation adaptation network for cross-domain image classification},
  author={Zhu, Yongchun and Zhuang, Fuzhen and Wang, Jindong and Chen, Jingwu and Shi, Zhiping and Wu, Wenjuan and He, Qing},
  journal={Neural Networks},
  volume={119},
  pages={214--221},
  year={2019},
  publisher={Elsevier}
}
```