Learning Where To Look - Generative NAS is Surprisingly Efficient [PDF]

Jovita Lukasik, Steffen Jung, Margret Keuper

Generative Model using Latent Space Optimization

Installation

Clone this repo and install requirements:

pip install -r requirements.txt

Also needed:

Usage

Preliminary

Define the directory path in Settings.py.
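
A minimal sketch of what Settings.py could contain is shown below, assuming it simply collects directory constants; the variable names PATH, DATA_PATH, and RESULTS_PATH are assumptions and should be adapted to the actual file in this repo.

import os

# Hypothetical sketch of Settings.py; all names below are assumptions.
# Root of the repository; the other paths are resolved relative to it.
PATH = os.path.dirname(os.path.abspath(__file__))

# Directory holding the downloaded benchmark data (e.g. NAS-Bench-101/201/301 files).
DATA_PATH = os.path.join(PATH, "datasets")

# Directory where training runs write checkpoints and logs.
RESULTS_PATH = os.path.join(PATH, "results")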

Generation

bash scripts/Train_G_NB101.sh
bash scripts/Train_G_NB201.sh
bash scripts/Train_G_NBNLP.sh
bash scripts/Train_G_NB301.sh

To train the generator model in the NAS-Bench-301 search space, first run datasets/NASBench301/create_random_data.py to generate 500k random data points. The pretrained generation model state dicts are in state_dicts/.
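
As a quick sanity check that the pretrained checkpoints can be read, the sketch below loads one state dict with PyTorch; the filename generator_nb301.pt and the variable name generator are assumptions, not the repository's actual naming.

import torch

# Hypothetical example: the checkpoint filename below is an assumption.
state_dict = torch.load("state_dicts/generator_nb301.pt", map_location="cpu")

# 'generator' stands for the generator model defined in this repo;
# instantiate it first, then restore the weights:
# generator.load_state_dict(state_dict)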

Search using AG-Net on CIFAR

bash scripts/Search_NB101.sh 
bash scripts/Search_NB201.sh 
bash scripts/Search_NB301.sh 
bash scripts/Search_NBNLP.sh 
bash scripts/Search_HW.sh 

Search on ImageNet

Follow TENAS for the initial steps and architecture evaluations.

bash scripts/Search_TENAS.sh

Search using XGB

bash scripts/Search_NB101_XGB_XGBranking.sh

Citation



@article{lukasik2022,
  author    = {Jovita Lukasik and
               Steffen Jung and
               Margret Keuper},
  title     = {Learning Where To Look - Generative {NAS} is Surprisingly Efficient},
  journal   = {CoRR},
  volume    = {abs/2203.08734},
  year      = {2022},
}

Acknowledgement

Code base from