On-Device Domain Generalization

Overview

This repo contains the source code of our project, "On-Device Domain Generalization," which studies how to improve the domain generalization (DG) performance of tiny neural networks, specifically for mobile DG applications. In the paper, we present a systematic study from which we find that knowledge distillation (KD) outperforms commonly used DG methods by a large margin under the on-device DG setting. We further propose a simple idea, called out-of-distribution knowledge distillation (OKD), which extends KD by teaching the student how the teacher handles out-of-distribution data synthesized via data augmentation. We also provide a new suite of DG datasets, named DOSCO-2k, which are built on top of existing vision datasets (much more diverse than existing DG datasets) by synthesizing contextual domain shift using a neural network pretrained on the Places dataset.

Updates

Get Started

1. Setup

This code is built on top of the awesome toolbox Dassl.pytorch, so you need to install the dassl environment first. Simply follow the instructions described here to install dassl as well as PyTorch. After that, run pip install -r requirements.txt under on-device-dg/ to install a few additional packages (remember to activate the dassl environment via conda activate dassl before installing them).
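For convenience, the steps above boil down to the following commands (a minimal sketch, assuming Dassl.pytorch and PyTorch have already been installed per the linked instructions):

```bash
# Activate the conda environment created during the Dassl.pytorch installation
conda activate dassl

# From the project root, install the extra packages required by this project
cd on-device-dg/
pip install -r requirements.txt
```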

2. Datasets and Models

We suggest you download all datasets and put them under the same folder, e.g., on-device-dg/data/.
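For concreteness, the resulting layout would look something like the tree below (the subfolder names are purely illustrative; use whatever names the dataset preparation instructions specify):

```
on-device-dg/
└── data/
    ├── <dataset-1>/
    ├── <dataset-2>/
    └── ...
```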

3. Training

The running scripts are provided in on-device-dg/scripts/.

The DATA_ROOT argument is set to ./data/ by default. Feel free to change the path.

Below are the example commands used to reproduce the results on DOSCO-2k's P-Air using MobileNetV3-Small (should be run under on-device-dg/):
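The scripts contain the exact commands; purely as an illustration, a run would look roughly like the sketch below, which follows Dassl.pytorch's standard train.py interface. The trainer name, config paths, and output directory here are placeholder assumptions, not the repo's actual values:

```bash
# Hypothetical sketch only -- see on-device-dg/scripts/ for the real commands.
# Flag names follow Dassl.pytorch's train.py; the trainer name and config
# paths below are placeholders.
python train.py \
    --root ./data/ \
    --trainer OKD \
    --dataset-config-file configs/datasets/p_air.yaml \
    --config-file configs/trainers/mnv3_small.yaml \
    --output-dir output/p_air/okd/mnv3_small
```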

Important notes:

Citation

@article{zhou2022device,
  title={On-Device Domain Generalization},
  author={Zhou, Kaiyang and Zhang, Yuanhan and Zang, Yuhang and Yang, Jingkang and Loy, Chen Change and Liu, Ziwei},
  journal={arXiv preprint arXiv:2209.07521},
  year={2022}
}