
<p align="center"> <br> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers"><img src="./docs/images/gradient_logo_ink.png" height="280"></a> <br> </p> <br>
<h2 align="center"> Simple and reliable optimization with local, global, population-based and sequential techniques in numerical discrete search spaces. </h2> <br> <table> <tbody> <tr align="left" valign="center"> <td> <strong>Master status:</strong> </td> <td> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_ubuntu.yml/badge.svg?branch=master" alt="img not loaded: try F5 :)"> </a> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_windows.yml/badge.svg?branch=master" alt="img not loaded: try F5 :)"> </a> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_macos.yml/badge.svg?branch=master" alt="img not loaded: try F5 :)"> </a> <a href="https://app.codecov.io/gh/SimonBlanke/Gradient-Free-Optimizers"> <img src="https://img.shields.io/codecov/c/github/SimonBlanke/Gradient-Free-Optimizers/master" alt="img not loaded: try F5 :)"> </a> </td> </tr> <tr align="left" valign="center"> <td> <strong>Dev status:</strong> </td> <td> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_ubuntu.yml/badge.svg?branch=dev" alt="img not loaded: try F5 :)"> </a> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_windows.yml/badge.svg?branch=dev" alt="img not loaded: try F5 :)"> </a> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions"> <img src="https://github.com/SimonBlanke/Gradient-Free-Optimizers/actions/workflows/tests_macos.yml/badge.svg?branch=dev" alt="img not loaded: try F5 :)"> </a> <a href="https://app.codecov.io/gh/SimonBlanke/Gradient-Free-Optimizers"> <img src="https://img.shields.io/codecov/c/github/SimonBlanke/Gradient-Free-Optimizers/dev" alt="img not loaded: try F5 :)"> </a> </td> </tr> <tr align="left" valign="center"> <td> <strong>Code quality:</strong> </td> <td> <a href="https://codeclimate.com/github/SimonBlanke/Gradient-Free-Optimizers"> <img src="https://img.shields.io/codeclimate/maintainability/SimonBlanke/Gradient-Free-Optimizers?style=flat-square&logo=code-climate" alt="img not loaded: try F5 :)"> </a> <a href="https://scrutinizer-ci.com/g/SimonBlanke/Gradient-Free-Optimizers/"> <img src="https://img.shields.io/scrutinizer/quality/g/SimonBlanke/Gradient-Free-Optimizers?style=flat-square&logo=scrutinizer-ci" alt="img not loaded: try F5 :)"> </a> </td> </tr> <tr align="left" valign="center"> <td> <strong>Latest versions:</strong> </td> <td> <a href="https://pypi.org/project/gradient_free_optimizers/"> <img src="https://img.shields.io/pypi/v/Gradient-Free-Optimizers?style=flat-square&logo=PyPi&logoColor=white&color=blue" alt="img not loaded: try F5 :)"> </a> </td> </tr> </tbody> </table> <br>

Introduction

Gradient-Free-Optimizers provides a collection of easy-to-use optimization techniques whose objective function only requires an arbitrary score that gets maximized. This makes gradient-free methods capable of solving a wide range of optimization problems, from mathematical test functions to hyperparameter optimization of machine-learning models and computationally expensive simulations.

Gradient-Free-Optimizers is the optimization backend of <a href="https://github.com/SimonBlanke/Hyperactive">Hyperactive</a> (in v3.0.0 and higher) but it can also be used by itself as a leaner and simpler optimization toolkit.

<br>
<div align="center"><a name="menu"></a> <h3> <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers#optimization-algorithms">Optimization algorithms</a> • <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers#installation">Installation</a> • <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers#examples">Examples</a> • <a href="https://simonblanke.github.io/gradient-free-optimizers-documentation">API reference</a> • <a href="https://github.com/SimonBlanke/Gradient-Free-Optimizers#roadmap">Roadmap</a> </h3> </div>
<br>

Main features

<br>

Optimization algorithms:

Gradient-Free-Optimizers supports a variety of optimization algorithms, which can make choosing the right one a tedious endeavor. The GIFs in this section give a visual impression of how the different algorithms explore the search space and exploit the collected information, for both a convex and a non-convex objective function. More detailed explanations of all optimization algorithms can be found in the official documentation.
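
All optimizers in the following sections share the interface used in the Examples section below: construct the optimizer with a search space, then call `search` with the objective function. A minimal sketch that swaps between two of the classes shown in this README:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer, RandomSearchOptimizer


def objective(para):
    # any function that returns a single score to be maximized
    return -(para["x"] ** 2)


search_space = {"x": np.arange(-10, 10, 0.1)}

# the same objective and search space work with every optimizer class;
# only the imported class changes
for Optimizer in (RandomSearchOptimizer, HillClimbingOptimizer):
    opt = Optimizer(search_space)
    opt.search(objective, n_iter=100)
```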

<br>

Local Optimization

<details> <summary><b>Hill Climbing</b></summary> <br>

Evaluates the score of n neighbours in an epsilon environment and moves to the best one.
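
As a minimal sketch, the size of that neighbourhood could be tuned via the constructor; `epsilon` and `n_neighbours` are assumed parameter names based on this description and should be checked against the API reference:

```python
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer


def objective(para):
    return -(para["x"] ** 2)


search_space = {"x": np.arange(-10, 10, 0.1)}

# epsilon controls the size of the neighbourhood, n_neighbours how many
# neighbours are evaluated per iteration (assumed parameter names)
opt = HillClimbingOptimizer(search_space, epsilon=0.05, n_neighbours=5)
opt.search(objective, n_iter=100)
```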

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/hill_climbing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/hill_climbing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Stochastic Hill Climbing</b></summary> <br>

Adds to hill climbing a probability of moving to a worse position in the search-space, in order to escape local optima.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/stochastic_hill_climbing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/stochastic_hill_climbing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Repulsing Hill Climbing</b></summary> <br>

Hill climbing algorithm that increases epsilon by a factor if no better neighbour was found.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/repulsing_hill_climbing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/repulsing_hill_climbing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Simulated Annealing</b></summary> <br>

Like stochastic hill climbing, it can move to a worse position in the search-space to escape local optima, but the probability of doing so decreases over time.
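
For intuition, the classic Metropolis-style acceptance rule behind this idea can be written in a few lines of plain Python; this is an illustrative sketch of the general technique, not the package's internal implementation:

```python
import math
import random


def accept(score_new, score_current, temperature):
    # always accept improvements; accept worse positions with
    # probability exp(delta / temperature), which shrinks as it cools
    delta = score_new - score_current
    return delta >= 0 or random.random() < math.exp(delta / temperature)


temperature, annealing_rate = 1.0, 0.97
for _ in range(100):
    # ... evaluate a neighbour, decide with accept(...), then cool down
    temperature *= annealing_rate
```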

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/simulated_annealing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/simulated_annealing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Downhill Simplex Optimization</b></summary> <br>

Constructs a simplex from multiple positions that moves through the search-space by reflecting, expanding, contracting or shrinking.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/downhill_simplex_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/downhill_simplex_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <br>

Global Optimization

<details> <summary><b>Random Search</b></summary> <br>

Moves to random positions in each iteration.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/random_search_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/random_search_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Grid Search</b></summary> <br>

Grid search that moves diagonally through the search-space (with step-size=1), starting from a corner.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/grid_search_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/grid_search_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Random Restart Hill Climbing</b></summary> <br>

Hill climbing that moves to a random position after n iterations.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/random_restart_hill_climbing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/random_restart_hill_climbing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Random Annealing</b></summary> <br>

Hill climbing with a large epsilon at the start of the search that decreases over time.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/random_annealing_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/random_annealing_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Pattern Search</b></summary> <br>

Creates a cross-shaped collection of positions that moves through the search-space either by shifting as a whole towards optima or by shrinking the cross.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/pattern_search_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/pattern_search_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Powell's Method</b></summary> <br>

Optimizes one search-space dimension at a time with a hill-climbing algorithm.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/powells_method_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/powells_method_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <br>

Population-Based Optimization

<details> <summary><b>Parallel Tempering</b></summary> <br>

Population of n simulated annealers, which occasionally swap transition probabilities.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/parallel_tempering_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/parallel_tempering_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Particle Swarm Optimization</b></summary> <br>

Population of n particles attracting each other and moving towards the best particle.
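
The attraction can be summarized by the standard particle-swarm velocity-update rule; the sketch below shows the general formula with illustrative inertia, cognitive, and social weights, and is not taken from the package's source:

```python
import numpy as np

rng = np.random.default_rng(0)

inertia, cognitive_weight, social_weight = 0.5, 0.5, 0.5  # illustrative values

position = np.array([2.0, -1.0])
velocity = np.zeros(2)
best_own = np.array([1.5, -0.5])    # best position this particle has visited
best_global = np.array([0.1, 0.2])  # best position found by the whole swarm

r1, r2 = rng.random(2), rng.random(2)
velocity = (
    inertia * velocity
    + cognitive_weight * r1 * (best_own - position)
    + social_weight * r2 * (best_global - position)
)
position = position + velocity
```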

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/particle_swarm_optimization_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/particle_swarm_optimization_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Spiral Optimization</b></summary> <br>

Population of n particles moving in a spiral pattern around the best position.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/spiral_optimization_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/spiral_optimization_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Genetic Algorithm</b></summary> <br>

Evolutionary algorithm selecting the best individuals in the population, mixing their parameters to get new solutions.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/genetic_algorithm_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/genetic_algorithm_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Evolution Strategy</b></summary> <br>

Population of n hill climbers that occasionally mix positional information and remove the worst positions from the population.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/evolution_strategy_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/evolution_strategy_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Differential Evolution</b></summary> <br>

Improves a population of candidate solutions by creating trial vectors through the differential mutation of three randomly selected individuals.
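
The core of differential mutation fits in a few lines; this is a generic sketch of the common rand/1/bin scheme with illustrative mutation and crossover values, not the package's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

population = rng.uniform(-5, 5, size=(10, 2))  # 10 candidates, 2 dimensions
mutation_factor, crossover_rate = 0.8, 0.9     # illustrative values

target_idx = 0
others = [i for i in range(len(population)) if i != target_idx]
a, b, c = population[rng.choice(others, size=3, replace=False)]

# differential mutation of three randomly selected individuals
mutant = a + mutation_factor * (b - c)

# binomial crossover between the target vector and the mutant
crossover_mask = rng.random(2) < crossover_rate
trial = np.where(crossover_mask, mutant, population[target_idx])
```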

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/differential_evolution_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/differential_evolution_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <br>

Sequential Model-Based Optimization

<details> <summary><b>Bayesian Optimization</b></summary> <br>

Gaussian process fitting to explored positions and predicting promising new positions.
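
The general idea can be sketched with a surrogate-model loop built on scikit-learn's Gaussian process; this illustrates the technique itself, not how the package implements it internally:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def objective(x):
    return -(x ** 2)


search_space = np.arange(-10, 10, 0.1).reshape(-1, 1)

# a few positions that have already been evaluated
X_observed = np.array([[-8.0], [3.0], [7.5]])
y_observed = objective(X_observed).ravel()

# fit a Gaussian process to the explored positions ...
gp = GaussianProcessRegressor().fit(X_observed, y_observed)

# ... and pick the next position via a simple upper-confidence-bound rule
mean, std = gp.predict(search_space, return_std=True)
next_position = search_space[np.argmax(mean + 1.96 * std)]
```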

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/bayesian_optimization_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/bayesian_optimization_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Lipschitz Optimization</b></summary> <br>

Calculates an upper bound from the distances of the previously explored positions to find new promising positions.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/lipschitz_optimizer_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/lipschitz_optimizer_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>DIRECT algorithm</b></summary> <br>

Separates the search-space into subspaces and evaluates the center position of each subspace to decide which one to separate further.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/direct_algorithm_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/direct_algorithm_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Tree of Parzen Estimators</b></summary> <br>

Kernel density estimators fitting to good and bad explored positions and predicting promising new positions.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/tree_structured_parzen_estimators_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/tree_structured_parzen_estimators_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <details> <summary><b>Forest Optimizer</b></summary> <br>

Ensemble of decision trees fitting to explored positions and predicting promising new positions.

<br> <table style="width:100%"> <tr> <th> <b>Convex Function</b> </th> <th> <b>Non-convex Function</b> </th> </tr> <tr> <td> <img src="./docs/gifs/forest_optimization_sphere_function_.gif" width="100%"> </td> <td> <img src="./docs/gifs/forest_optimization_ackley_function_.gif" width="100%"> </td> </tr> </table> </details> <br>

Sideprojects and Tools

The following packages are designed to support Gradient-Free-Optimizers and expand its use cases.

| Package | Description |
|---------|-------------|
| Search-Data-Collector | Simple tool to save search-data during or after the optimization run into csv-files. |
| Search-Data-Explorer | Visualize search-data with plotly inside a streamlit dashboard. |

If you want news about Gradient-Free-Optimizers and related projects, you can follow me on Twitter.

<br>

Installation


The most recent version of Gradient-Free-Optimizers is available on PyPI:

pip install gradient-free-optimizers
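
Note that the distribution name uses hyphens, while the module is imported with underscores (as in the examples below). A quick sanity check after installation:

```python
# installed as "gradient-free-optimizers", imported with underscores
import gradient_free_optimizers as gfo

print(gfo.RandomSearchOptimizer)
```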
<br>

Examples

<details> <summary><b>Convex function</b></summary>
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer


def parabola_function(para):
    loss = para["x"] * para["x"]
    return -loss


search_space = {"x": np.arange(-10, 10, 0.1)}

opt = RandomSearchOptimizer(search_space)
opt.search(parabola_function, n_iter=100000)
</details> <details> <summary><b>Non-convex function</b></summary>
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer


def ackley_function(pos_new):
    x = pos_new["x1"]
    y = pos_new["x2"]

    a1 = -20 * np.exp(-0.2 * np.sqrt(0.5 * (x * x + y * y)))
    a2 = -np.exp(0.5 * (np.cos(2 * np.pi * x) + np.cos(2 * np.pi * y)))
    score = a1 + a2 + 20
    return -score


search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}

opt = RandomSearchOptimizer(search_space)
opt.search(ackley_function, n_iter=30000)
</details> <details> <summary><b>Machine learning example</b></summary>
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_wine

from gradient_free_optimizers import HillClimbingOptimizer


data = load_wine()
X, y = data.data, data.target


def model(para):
    gbc = GradientBoostingClassifier(
        n_estimators=para["n_estimators"],
        max_depth=para["max_depth"],
        min_samples_split=para["min_samples_split"],
        min_samples_leaf=para["min_samples_leaf"],
    )
    scores = cross_val_score(gbc, X, y, cv=3)

    return scores.mean()


search_space = {
    "n_estimators": np.arange(20, 120, 1),
    "max_depth": np.arange(2, 12, 1),
    "min_samples_split": np.arange(2, 12, 1),
    "min_samples_leaf": np.arange(1, 12, 1),
}

opt = HillClimbingOptimizer(search_space)
opt.search(model, n_iter=50)
</details> <details> <summary><b>Constrained Optimization example</b></summary>
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer


def convex_function(pos_new):
    score = -(pos_new["x1"] * pos_new["x1"] + pos_new["x2"] * pos_new["x2"])
    return score


search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}


def constraint_1(para):
    # only values in 'x1' higher than -5 are valid
    return para["x1"] > -5


# put one or more constraints inside a list
constraints_list = [constraint_1]


# pass list of constraints to the optimizer
opt = RandomSearchOptimizer(search_space, constraints=constraints_list)
opt.search(convex_function, n_iter=50)

search_data = opt.search_data

# the search-data does not contain any samples where x1 is equal or below -5
print("\n search_data \n", search_data, "\n")
</details> <br>
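
After `search` finishes, the collected information can be inspected on the optimizer object. The `search_data` attribute appears in the constrained-optimization example above; `best_para` and `best_score` are assumed attribute names that follow the same pattern and should be checked against the API reference. A short sketch:

```python
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer


def parabola_function(para):
    return -para["x"] * para["x"]


search_space = {"x": np.arange(-10, 10, 0.1)}

opt = RandomSearchOptimizer(search_space)
opt.search(parabola_function, n_iter=1000)

print(opt.search_data)                # all evaluated positions and their scores
print(opt.best_para, opt.best_score)  # assumed attribute names, see API reference
```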

Roadmap

<details> <summary><b>v0.3.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v0.4.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v0.5.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.0.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.1.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.2.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.3.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.4.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.5.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>v1.6.0</b> :heavy_check_mark:</summary> </details> <details> <summary><b>Future releases</b> </summary> </details> <br>

Gradient Free Optimizers <=> Hyperactive

Gradient-Free-Optimizers was created as the optimization backend of the Hyperactive package, so the algorithms are exactly the same in both packages and deliver the same results. However, you can still use Gradient-Free-Optimizers as a standalone package. Keeping Gradient-Free-Optimizers separate from Hyperactive has several advantages.

While Gradient-Free-Optimizers is relatively simple, Hyperactive is a more complex project with additional features to make optimization of computationally expensive models (like engineering simulation or machine-/deep-learning models) more convenient.

<br>

Citation

@Misc{gfo2020,
  author =   {{Simon Blanke}},
  title =    {{Gradient-Free-Optimizers}: Simple and reliable optimization with local, global, population-based and sequential techniques in numerical search spaces.},
  howpublished = {\url{https://github.com/SimonBlanke}},
  year = {since 2020}
}
<br>

License

Gradient-Free-Optimizers is licensed under the following license:

LICENSE