MetaNTK-NAS: Global Convergence of MAML and Theory-Inspired Neural Architecture Search for Few-Shot Learning

Haoxiang Wang*, Yite Wang*, Ruoyu Sun, Bo Li

In CVPR 2022.

If you find this repo useful for your research, please consider citing our paper:

@inproceedings{MetaNTK-NAS,
  title={Global Convergence of MAML and Theory-Inspired Neural Architecture Search for Few-Shot Learning},
  author={Wang, Haoxiang and Wang, Yite and Sun, Ruoyu and Li, Bo},
  booktitle={CVPR},
  year={2022}
}

Overview

This is the PyTorch implementation of MetaNTK-NAS, a training-free NAS method for few-shot learning based on Meta Neural Tangent Kernels (MetaNTK).
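For intuition, below is a minimal, self-contained sketch (plain PyTorch, not this repository's code) of the empirical NTK Gram matrix that kernel-based training-free scores are built from. The toy network, input sizes, and seed are arbitrary choices for illustration; the actual MetaNTK computation is implemented in this repository's source files.

import torch
import torch.nn as nn

# Toy setup: a small MLP and a handful of random inputs (illustrative only).
torch.manual_seed(0)
net = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
x = torch.randn(8, 16)

# Per-sample gradient of the scalar output w.r.t. all parameters (Jacobian rows).
params = [p for p in net.parameters() if p.requires_grad]
rows = []
for i in range(x.shape[0]):
    out = net(x[i:i + 1]).squeeze()
    grads = torch.autograd.grad(out, params)
    rows.append(torch.cat([g.reshape(-1) for g in grads]))
jacobian = torch.stack(rows)      # shape: (num_samples, num_params)

# Empirical NTK Gram matrix: inner products of per-sample gradients.
ntk = jacobian @ jacobian.t()     # shape: (8, 8)
print(ntk.shape)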

Installation

This repository has been tested on Red Hat with PyTorch 1.3.1 on NVIDIA V100 GPUs, and on Ubuntu with PyTorch 1.10 on RTX 3090 and NVIDIA V100 GPUs. For other platforms, configurations may need to be adjusted.

Required packages

All required packages are listed in requirements.txt. You can install them with:

pip install -r requirements.txt
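If you want a quick sanity check that your environment matches one of the tested configurations above, a short snippet like the following (nothing repo-specific) prints the PyTorch version and the visible GPU:

import torch

print("PyTorch version:", torch.__version__)        # tested: 1.3.1 and 1.10
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))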

Usage

0. Prepare the dataset

1. Search

DARTS_fewshot Space

You may want to check the sample scripts in the scripts folder. They call prune_launch.py with predefined configurations; several arguments in them may need to be modified to replicate our experimental results.

You may also call prune_metantknas.py directly, which gives you much more flexibility; check the file for more details. A hypothetical example of such a direct call is sketched below.
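As a rough illustration only, here is how a direct call might be wrapped programmatically. The flag names below (--space, --gpu, --seed) are placeholders, not the script's documented interface; the real argument names must be taken from prune_metantknas.py itself.

import subprocess

# Hypothetical invocation for illustration only: the actual command-line
# arguments are defined in prune_metantknas.py and should be checked there.
cmd = [
    "python", "prune_metantknas.py",
    "--space", "darts_fewshot",   # placeholder flag/value
    "--gpu", "0",                 # placeholder flag/value
    "--seed", "0",                # placeholder flag/value
]
subprocess.run(cmd, check=True)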

2. Evaluation

To-Do

Acknowledgement