Data-Free-NAS

This is the PyTorch implementation of our paper "Data-Free Neural Architecture Search via Recursive Label Calibration", published at ECCV 2022.

<div align=center> <img width=60% src="https://github.com/liuzechun0216/images/blob/master/data-free_NAS_github.jpg"/> </div>

This paper explores the feasibility of neural architecture search (NAS) without the original training data, given only a pre-trained model. Our results demonstrate that the architectures discovered by our data-free NAS achieve accuracy comparable to architectures searched on the original natural data, showing that NAS can be performed effectively without access to real data.

Citation

If you find our code useful for your research, please consider citing:

```bibtex
@inproceedings{liu2022data,
  title={Data-Free Neural Architecture Search via Recursive Label Calibration},
  author={Liu, Zechun and Shen, Zhiqiang and Long, Yun and Xing, Eric and Cheng, Kwang-Ting and Leichner, Chas},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2022}
}
```

Run

1. Requirements:

2. Steps to run:

Step 1: image synthesis
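The core idea of this step is to optimize random inputs so that the frozen pre-trained model predicts a chosen label. The actual method inverts a pre-trained ResNet-50 with additional regularizers and recursive label calibration; the following is only a minimal toy sketch of model inversion, using a made-up linear classifier in place of the real network.

```python
import numpy as np

# Toy stand-in for the pretrained model: a fixed linear classifier W.
# (Assumption: the real repo inverts a ResNet-50; this only illustrates
# optimizing the input while the model weights stay frozen.)
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 64))            # 10 classes, 64-dim "images"

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def synthesize(target, steps=300, lr=0.01):
    """Gradient descent on the input so the frozen model predicts
    `target`; only x is updated, never W."""
    x = rng.normal(size=64) * 0.01
    onehot = np.eye(10)[target]
    for _ in range(steps):
        p = softmax(W @ x)
        x -= lr * (W.T @ (p - onehot))   # dCE/dx for a linear model
    return x

x = synthesize(target=3)
pred = int(softmax(W @ x).argmax())      # expected to recover class 3
```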

Step 2: neural architecture search

Step 2.0: split the synthesized data into the training set for supernet training and validation set for evolutionary search
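A minimal sketch of this split, assuming a flat list of synthesized image files; the real script names and split ratio are not stated in this README, so both are placeholders.

```python
import random

def split_synthesized(files, val_ratio=0.1, seed=0):
    """Shuffle the synthesized images once, then carve off a validation
    subset for evolutionary search; the rest trains the supernet."""
    files = list(files)
    random.Random(seed).shuffle(files)
    n_val = int(len(files) * val_ratio)
    return files[n_val:], files[:n_val]      # (train, val)

# Hypothetical filenames for illustration only.
train, val = split_synthesized([f"img_{i:05d}.png" for i in range(1000)])
```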

Step 2.1: supernet training
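Supernet training here follows Single Path One-Shot (SPOS): each layer holds several candidate ops, and every training step activates one uniformly sampled op per layer so all candidates share weights and training budget. A toy sketch of the sampling and forward pass, with trivial lambdas standing in for the real choice blocks:

```python
import random

N_LAYERS, N_CHOICES = 4, 4
# Toy "ops": op k in every layer just adds k to its input.
ops = [[(lambda x, k=k: x + k) for k in range(N_CHOICES)]
       for _ in range(N_LAYERS)]

def sample_path(rng):
    """One uniformly random architecture: a choice index per layer."""
    return [rng.randrange(N_CHOICES) for _ in range(N_LAYERS)]

def forward(x, path):
    """Run the input through only the sampled op in each layer;
    in real SPOS training, backprop then updates only those ops."""
    for layer, choice in zip(ops, path):
        x = layer[choice](x)
    return x

rng = random.Random(0)
path = sample_path(rng)
out = forward(0, path)        # with these toy ops, equals sum(path)
```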

Step 2.2: evolutionary search
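The search evolves a population of architectures (tuples of per-layer choices) by elitist selection, mutation, and crossover. In the real pipeline the fitness is the trained supernet's accuracy on the synthesized validation split; the quadratic fitness below is a made-up stand-in so the loop runs end to end.

```python
import random

N_LAYERS, N_CHOICES, POP, GENS = 4, 4, 20, 15
rng = random.Random(0)

def fitness(path):                        # placeholder for val accuracy
    return -sum((c - 2) ** 2 for c in path)

def mutate(path, p=0.25):
    """Resample each layer choice independently with probability p."""
    return tuple(rng.randrange(N_CHOICES) if rng.random() < p else c
                 for c in path)

def crossover(a, b):
    """Pick each layer choice from one of the two parents."""
    return tuple(rng.choice(pair) for pair in zip(a, b))

pop = [tuple(rng.randrange(N_CHOICES) for _ in range(N_LAYERS))
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]              # keep the top half (elitism)
    pop = (parents
           + [mutate(rng.choice(parents)) for _ in range(POP // 4)]
           + [crossover(rng.choice(parents), rng.choice(parents))
              for _ in range(POP // 4)])
best = max(pop, key=fitness)
```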

Step 2.3: evaluation

Models and synthesized data

1. Pretrained ResNet-50 model: ResNet-50

2. Synthesized images: Data

3. Searched model and final results:

| Methods | Top-1 Err (%) | FLOPs (M) | Data for NAS |
|---|---|---|---|
| Single Path One-Shot (SPOS) | 25.7 | 319 | ImageNet |
| Data-Free SPOS | 25.8 | 316 | Synthesized data |

Contact

Zechun Liu, HKUST (zliubq at connect.ust.hk)