Introduction

PC-DARTS has been accepted for spotlight presentation at ICLR 2020!

PC-DARTS is a memory-efficient differentiable architecture search method based on DARTS. It focuses on reducing the large memory cost of the super-net in one-shot NAS methods, which means it can also be combined with other one-shot NAS methods, e.g., ENAS. Unlike previous methods that sample operations, PC-DARTS samples channels of the constructed super-net. Interestingly, although we introduce randomness during the search process, the searched architectures perform better and more stably than those of DARTS! For a detailed description of the technical details and experimental results, please refer to our paper:

Partial Channel Connections for Memory-Efficient Differentiable Architecture Search

Yuhui Xu, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Guo-Jun Qi, Qi Tian and Hongkai Xiong.

This code is based on the implementation of DARTS.
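
The core idea of partial channel connections can be sketched as follows: only 1/K of the channels are sent through the candidate operations, while the rest bypass them and are concatenated back. Below is a minimal NumPy illustration under our own naming (`partial_channel_forward`, `op`, and `k` are hypothetical names, not identifiers from this repository):

```python
import numpy as np

def partial_channel_forward(x, op, k=4):
    """Send only 1/k of the channels of x through op; bypass the rest.

    x  : feature map of shape (N, C, H, W)
    op : the (mixed) operation applied to the sampled channels
    k  : the partial-channel fraction is 1/k
    """
    n, c, h, w = x.shape
    c_active = c // k
    active, bypass = x[:, :c_active], x[:, c_active:]
    # Process the sampled channels, leave the others untouched,
    # then concatenate along the channel axis.
    out = np.concatenate([op(active), bypass], axis=1)
    # In PC-DARTS a channel shuffle follows, so that every channel is
    # eventually sampled across cells; omitted here for brevity.
    return out
```

Since only C/k channels enter the mixed operation, the super-net's activation memory on that edge shrinks roughly by a factor of k, which is what allows larger batch sizes during search.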

Updates

Results

Results on CIFAR10

| Method | Params (M) | Error (%) | Search Cost (GPU-days) |
| --- | --- | --- | --- |
| AmoebaNet-B | 2.8 | 2.55 | 3150 |
| DARTS (V1) | 3.3 | 3.00 | 0.4 |
| DARTS (V2) | 3.3 | 2.76 | 1.0 |
| SNAS | 2.8 | 2.85 | 1.5 |
| PC-DARTS | 3.6 | 2.57 | 0.1 |

Only 0.1 GPU-days are used for a search on CIFAR-10!

Results on ImageNet

| Method | FLOPs (M) | Top-1 Error (%) | Top-5 Error (%) | Search Cost (GPU-days) |
| --- | --- | --- | --- | --- |
| NASNet-A | 564 | 26.0 | 8.4 | 1800 |
| AmoebaNet-B | 570 | 24.3 | 7.6 | 3150 |
| PNAS | 588 | 25.8 | 8.1 | 225 |
| DARTS (V2) | 574 | 26.7 | 8.7 | 1.0 |
| SNAS | 522 | 27.3 | 9.3 | 1.5 |
| PC-DARTS | 597 | 24.2 | 7.3 | 3.8 |

We directly searched for a good architecture on ImageNet using the search space of DARTS (the first time this has been done!).

Usage

Search on CIFAR10

To run our code, you only need one Nvidia 1080Ti (11 GB memory).

```shell
python train_search.py \
```

Search on ImageNet

Data preparation: 10% and 2.5% of the images need to be randomly sampled from each class of the training set as train and val, respectively. The sampled data is saved into `./imagenet_search`. Note: do not use `torch.utils.data.sampler.SubsetRandomSampler` for data sampling, as ImageNet is too large.

```shell
python train_search_imagenet.py \
       --tmp_data_dir /path/to/your/sampled/data \
       --save log_path \
```

The evaluation process simply follows that of DARTS.

Here is the evaluation on CIFAR10:

```shell
python train.py \
       --auxiliary \
       --cutout \
```

Here is the evaluation on ImageNet (mobile setting):

```shell
python train_imagenet.py \
       --tmp_data_dir /path/to/your/data \
       --save log_path \
       --auxiliary \
       --note note_of_this_run
```

Pretrained models

Coming soon!

Notes

When the validation queue is exhausted during search, its iterator is re-created so that the architecture-update step never runs out of batches:

```python
try:
    input_search, target_search = next(valid_queue_iter)
except StopIteration:
    # validation queue exhausted: restart it
    valid_queue_iter = iter(valid_queue)
    input_search, target_search = next(valid_queue_iter)
```
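
The same restart-on-exhaustion pattern can be wrapped into a small generator (a hypothetical helper, not part of the released code):

```python
def cycling_batches(loader):
    """Yield items from loader indefinitely, re-creating its iterator
    each time it is exhausted (the pattern used in the snippet above)."""
    it = iter(loader)
    while True:
        try:
            yield next(it)
        except StopIteration:
            it = iter(loader)
            yield next(it)
```

This way the architecture update can always draw a fresh validation batch, even when the validation queue is shorter than the training queue.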

Related work

Progressive Differentiable Architecture Search

Differentiable Architecture Search

Reference

If you use our code in your research, please cite our paper accordingly.

```
@inproceedings{xu2020pcdarts,
  title     = {{PC}-{DARTS}: Partial Channel Connections for Memory-Efficient Architecture Search},
  author    = {Yuhui Xu and Lingxi Xie and Xiaopeng Zhang and Xin Chen and Guo-Jun Qi and Qi Tian and Hongkai Xiong},
  booktitle = {International Conference on Learning Representations},
  year      = {2020},
  url       = {https://openreview.net/forum?id=BJlS634tPr}
}
```