<h1 align="center"> <img src="./docs/images/logo-title.png" alt="logo" width="200"> <br> </h1>

Katib is a Kubernetes-native project for automated machine learning (AutoML). Katib supports Hyperparameter Tuning, Early Stopping and Neural Architecture Search.

Katib is agnostic to machine learning (ML) frameworks: it can tune hyperparameters of applications written in any language of the user's choice and natively supports many ML frameworks, such as TensorFlow, PyTorch, XGBoost, and others.

Katib can run training jobs using any Kubernetes Custom Resource, with out-of-the-box support for the Kubeflow Training Operator, Argo Workflows, Tekton Pipelines, and many more.

Katib stands for secretary in Arabic.

Search Algorithms

Katib supports several search algorithms. Follow the Kubeflow documentation to learn more about each algorithm, and check this guide to implement your own custom algorithm.

| Hyperparameter Tuning | Neural Architecture Search | Early Stopping |
| --- | --- | --- |
| [Random Search](https://www.kubeflow.org/docs/components/katib/experiment/#random-search) | [ENAS](https://www.kubeflow.org/docs/components/katib/experiment/#neural-architecture-search-based-on-enas) | [Median Stop](https://www.kubeflow.org/docs/components/katib/early-stopping/#median-stopping-rule) |
| [Grid Search](https://www.kubeflow.org/docs/components/katib/experiment/#grid-search) | [DARTS](https://www.kubeflow.org/docs/components/katib/experiment/#differentiable-architecture-search-darts) | |
| [Bayesian Optimization](https://www.kubeflow.org/docs/components/katib/experiment/#bayesian-optimization) | | |
| [TPE](https://www.kubeflow.org/docs/components/katib/experiment/#tree-of-parzen-estimators-tpe) | | |
| [Multivariate TPE](https://www.kubeflow.org/docs/components/katib/experiment/#multivariate-tpe) | | |
| [CMA-ES](https://www.kubeflow.org/docs/components/katib/experiment/#covariance-matrix-adaptation-evolution-strategy-cma-es) | | |
| [Sobol's Quasirandom Sequence](https://www.kubeflow.org/docs/components/katib/experiment/#sobols-quasirandom-sequence) | | |
| [HyperBand](https://www.kubeflow.org/docs/components/katib/experiment/#hyperband) | | |
| [Population Based Training](https://www.kubeflow.org/docs/components/katib/experiment/#pbt) | | |

To perform the above algorithms, Katib builds on hyperparameter optimization frameworks such as Goptuna, Hyperopt, and Optuna.
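
Within an Experiment, the search algorithm is selected by name. Below is a minimal, hedged sketch of the `algorithm` section of an Experiment spec (the `random_state` setting shown here applies to the Bayesian optimization suggestion service; other algorithms accept different settings):

```yaml
# Sketch of the algorithm section of a Katib Experiment (v1beta1 API).
# algorithmName must match one of the algorithms listed in the table above.
algorithm:
  algorithmName: bayesianoptimization
  algorithmSettings:
    # Settings are passed through to the underlying suggestion service;
    # random_state fixes the seed for reproducible suggestions.
    - name: random_state
      value: "10"
```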

Prerequisites

Please check the official Kubeflow documentation for prerequisites to install Katib.

Installation

Please follow the Kubeflow Katib guide for detailed instructions on how to install Katib.

Installing the Control Plane

Run the following command to install the latest stable release of the Katib control plane:

```shell
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=v0.17.0"
```

Run the following command to install the latest changes of the Katib control plane:

```shell
kubectl apply -k "github.com/kubeflow/katib.git/manifests/v1beta1/installs/katib-standalone?ref=master"
```

For Katib Experiment examples, check the complete examples list.
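
As an illustration, the manifest below sketches a minimal Experiment that runs a random search over the learning rate of a Kubernetes `Job`. It is modeled on the random search example; the container image, training script, and metric name come from the Katib examples and should be replaced with your own training code:

```yaml
apiVersion: kubeflow.org/v1beta1
kind: Experiment
metadata:
  name: random-example
  namespace: kubeflow
spec:
  objective:
    type: maximize
    goal: 0.99
    objectiveMetricName: Validation-accuracy
  algorithm:
    algorithmName: random
  parallelTrialCount: 3
  maxTrialCount: 12
  maxFailedTrialCount: 3
  parameters:
    - name: lr
      parameterType: double
      feasibleSpace:
        min: "0.01"
        max: "0.03"
  trialTemplate:
    primaryContainerName: training-container
    trialParameters:
      - name: learningRate
        description: Learning rate for the training job
        reference: lr
    trialSpec:
      apiVersion: batch/v1
      kind: Job
      spec:
        template:
          spec:
            containers:
              - name: training-container
                # Example image from the Katib repository; replace with your own.
                image: docker.io/kubeflowkatib/pytorch-mnist-cpu:latest
                command:
                  - "python3"
                  - "/opt/pytorch-mnist/mnist.py"
                  - "--epochs=1"
                  - "--lr=${trialParameters.learningRate}"
            restartPolicy: Never
```

Once the control plane is installed, a manifest like this can be applied with `kubectl apply -f <file>`.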

Installing the Python SDK

Katib provides a Python SDK that simplifies the creation of hyperparameter tuning jobs for data scientists.

Run the following command to install the latest stable release of the Katib SDK:

```shell
pip install -U kubeflow-katib
```

Getting Started

Please refer to the getting started guide to quickly create your first hyperparameter tuning Experiment using the Python SDK.
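
For reference, a minimal sketch of a tuning run with the SDK is shown below. It follows the getting-started guide; the objective function, parameter names, and `kubeflow` namespace are illustrative and should be adapted to your cluster:

```python
import kubeflow.katib as katib

# Objective function executed in each Trial. Katib parses metrics printed
# in "<metric-name>=<value>" format.
def objective(parameters):
    result = 4 * int(parameters["a"]) - float(parameters["b"]) ** 2
    print(f"result={result}")

# Search space: one integer and one continuous hyperparameter.
parameters = {
    "a": katib.search.int(min=10, max=20),
    "b": katib.search.double(min=0.1, max=0.2),
}

client = katib.KatibClient(namespace="kubeflow")

# Create the hyperparameter tuning Experiment.
client.tune(
    name="tune-experiment",
    objective=objective,
    parameters=parameters,
    objective_metric_name="result",
    max_trial_count=12,
    resources_per_trial={"cpu": "1"},
)

# Wait for completion and print the best hyperparameters found.
client.wait_for_experiment_condition(name="tune-experiment")
print(client.get_optimal_hyperparameters(name="tune-experiment"))
```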

Community

The following links provide information on how to get involved in the community:

Contributing

Please refer to the CONTRIBUTING guide.

Citation

If you use Katib in a scientific publication, we would appreciate citations to the following paper:

A Scalable and Cloud-Native Hyperparameter Tuning System, George et al., arXiv:2006.02085, 2020.

Bibtex entry:

```bibtex
@misc{george2020katib,
    title={A Scalable and Cloud-Native Hyperparameter Tuning System},
    author={Johnu George and Ce Gao and Richard Liu and Hou Gang Liu and Yuan Tang and Ramdoot Pydipaty and Amit Kumar Saha},
    year={2020},
    eprint={2006.02085},
    archivePrefix={arXiv},
    primaryClass={cs.DC}
}
```