
<div align="center"> <img src="./docs/images/pipeline.jpg"/> </div>

# Vega

English | 中文


Vega v1.8.5 released


## Introduction

Vega is an AutoML algorithm toolchain developed by Noah's Ark Laboratory. Its main features are as follows:

  1. Full pipeline capabilities: The AutoML capabilities cover key functions such as hyperparameter optimization (HPO), data augmentation, network architecture search (NAS), model compression, and fully train. These functions are highly decoupled and can be combined as required to construct a complete pipeline.
  2. Industry-leading AutoML algorithms: Provides industry-leading algorithms developed by Noah's Ark Laboratory (Benchmark) and a Model Zoo for downloading state-of-the-art (SOTA) models.
  3. Fine-grained network search space: The search space can be freely defined, and rich network architecture parameters are provided for use in it. Network architecture parameters and model training hyperparameters can be searched simultaneously, and the search space can be applied to PyTorch, TensorFlow, and MindSpore.
  4. High-concurrency neural network training capability: Provides high-performance trainers to accelerate model training and evaluation.
  5. Multi-backend support: PyTorch (GPU and Ascend 910), TensorFlow (GPU and Ascend 910), and MindSpore (Ascend 910).
  6. Ascend platform: Search and training on the Ascend 910 and model evaluation on the Ascend 310.

## Algorithm List

| Category | Algorithm | Description | Reference |
| :-- | :-- | :-- | :-- |
| NAS | CARS: Continuous Evolution for Efficient Neural Architecture Search | Structure search method for multi-objective, efficient neural networks based on continuous evolution | ref |
| NAS | ModularNAS: Towards Modularized and Reusable Neural Architecture Search | A code library for various neural architecture search methods, including weight sharing and network morphism | ref |
| NAS | MF-ASC | Multi-fidelity neural architecture search with co-kriging | ref |
| NAS | NAGO: Neural Architecture Generator Optimization | A hierarchical graph-based neural architecture search space | ref |
| NAS | SR-EA | An automatic network architecture search method for super-resolution | ref |
| NAS | ESR-EA: Efficient Residual Dense Block Search for Image Super-resolution | Multi-objective image super-resolution based on network architecture search | ref |
| NAS | Adelaide-EA: SEGMENTATION-Adelaide-EA-NAS | Network architecture search algorithm for image segmentation | ref |
| NAS | SP-NAS: Serial-to-Parallel Backbone Search for Object Detection | Efficient search algorithm for the backbone network architecture of object detection and semantic segmentation | ref |
| NAS | SM-NAS: Structural-to-Modular NAS | Two-stage object detection architecture search algorithm | Coming soon |
| NAS | Auto-Lane: CurveLane-NAS | An end-to-end framework search algorithm for lane detection | ref |
| NAS | AutoFIS | An automatic feature selection algorithm for recommender system scenarios | ref |
| NAS | AutoGroup | Automatically learns feature interactions for recommender system scenarios | ref |
| Model Compression | Quant-EA: Quantization based on Evolutionary Algorithm | Automatic mixed-bit quantization algorithm that uses an evolutionary strategy to quantize each layer of a CNN | ref |
| Model Compression | Prune-EA | Automatic channel pruning algorithm using evolutionary strategies | ref |
| HPO | ASHA: Asynchronous Successive Halving Algorithm | Dynamic continuous halving algorithm | ref |
| HPO | BOHB: Hyperband with Bayesian Optimization | Hyperband with Bayesian optimization | ref |
| HPO | BOSS: Bayesian Optimization via Sub-Sampling | A universal hyperparameter optimization algorithm based on the Bayesian optimization framework, for resource-constrained hyperparameter search | ref |
| Data Augmentation | PBA: Population Based Augmentation: Efficient Learning of Augmentation Policy Schedules | Data augmentation based on PBT optimization | ref |
| Data Augmentation | CycleSR: Unsupervised Image Super-Resolution with an Indirect Supervised Path | Unsupervised style-transfer algorithm for low-level vision problems | ref |
| Fully Train | Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks | Neural network training (regularization) based on feature map distortion | ref |
| Fully Train | Circumventing Outliers of AutoAugment with Knowledge Distillation | Joint knowledge distillation and data augmentation for training high-performance classification models; achieves 85.8% top-1 accuracy on ImageNet-1k | Coming soon |

## Installation

Run the following commands to install Vega:

```bash
pip3 install --user --upgrade noah-vega
```

## Usage

Use the `vega` command to run a Vega application. For example, the following command runs the CARS algorithm:

```bash
vega ./examples/nas/cars/cars.yml
```

The cars.yml file contains definitions such as the pipeline, search algorithm, search space, and training parameters. Vega provides more than 40 examples for reference: Examples, Example Guide, and Configuration Guide.
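As an illustration, such a pipeline configuration is roughly structured as below. This is a minimal sketch: the step, algorithm, and dataset names are assumptions chosen for illustration, and the exact schema for each algorithm is given in the Configuration Guide.

```yaml
# Minimal sketch of a Vega pipeline configuration (illustrative only;
# see the Configuration Guide for the exact schema of each algorithm).
general:
  backend: pytorch            # assumption: also tensorflow / mindspore

pipeline: [nas, fully_train]  # pipe steps execute in the listed order

nas:                          # a search step: algorithm + space + trainer
  pipe_step:
    type: SearchPipeStep
  search_algorithm:
    type: CARSAlgorithm       # assumption: algorithm-specific type name
  search_space:
    type: SearchSpace
  dataset:
    type: Cifar10
  trainer:
    type: Trainer
    epochs: 1

fully_train:                  # retrains the best model found by the search
  pipe_step:
    type: TrainPipeStep
  trainer:
    type: Trainer
    epochs: 600
```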

Security mode is intended for scenarios with high communication security requirements. Complete the security configuration before running the command with the `-s` option:

```bash
vega ./examples/nas/cars/cars.yml -s
```
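Besides the CLI, earlier Vega releases also expose a Python entry point, `vega.run()`, which accepts the same configuration file. The snippet below is a sketch under that assumption; the `vega` command above is the documented path.

```python
# Sketch: launch a Vega pipeline from Python. Assumes the noah-vega
# package exposes vega.run() (verify against your installed version;
# the `vega` CLI is the documented entry point).
import vega

if __name__ == "__main__":
    vega.run("./examples/nas/cars/cars.yml")  # runs the full pipeline
```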

## Reference

| Reader | Reference |
| :-- | :-- |
| User | Install Guide, Deployment Guide, Configuration Guide, Security Configuration, Examples, Evaluate Service |
| Developer | Development Reference, Quick Start Guide, Dataset Guide, Algorithm Development Guide |

## FAQ

For common problems and exception handling, please refer to FAQ.

## Citation

```bibtex
@misc{wang2020vega,
      title={VEGA: Towards an End-to-End Configurable AutoML Pipeline},
      author={Bochao Wang and Hang Xu and Jiajin Zhang and Chen Chen and Xiaozhi Fang and Ning Kang and Lanqing Hong and Wei Zhang and Yong Li and Zhicheng Liu and Zhenguo Li and Wenzhi Liu and Tong Zhang},
      year={2020},
      eprint={2011.01507},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```

## Cooperation and Contribution

Welcome to Vega! If you have questions or suggestions, need help, want to fix bugs, contribute new algorithms, or improve the documentation, please submit an issue in the community. We will reply and communicate with you in a timely manner.