July 29th, 2024

New release of Gradient-Free-Optimizers with two new evolutionary algorithms

The Gradient-Free-Optimizers GitHub repository offers a variety of gradient-free optimization techniques behind a user-friendly API, supports multiple algorithms including Bayesian optimization, and comes with easy installation and comprehensive documentation.


The GitHub repository Gradient-Free-Optimizers offers a collection of optimization techniques that do not depend on gradients and operate over discrete numerical search spaces. It features a user-friendly API for defining objective functions and search spaces, and incorporates modern optimization methods like Bayesian optimization, which are particularly effective for expensive objective functions. The library is rigorously tested, with over 400 tests validating the behavior of its algorithms.
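In practice, that API boils down to defining an objective function over a dictionary of parameters and a search space of numpy arrays. A minimal sketch following the project's README (exact signatures may differ between versions):

import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer

# Objective function: the optimizer maximizes the returned score,
# so return the negative of the value to be minimized.
def parabola(para):
    return -(para["x"] ** 2)

# Discrete numerical search space, given as numpy arrays.
search_space = {"x": np.arange(-10, 10, 0.1)}

opt = HillClimbingOptimizer(search_space)
opt.search(parabola, n_iter=10000)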

The repository supports various optimization algorithms categorized into four main types: local optimization methods such as Hill Climbing and Simulated Annealing; global optimization techniques including Random Search and Grid Search; population-based methods like Particle Swarm Optimization and Genetic Algorithms; and sequential model-based optimization strategies such as Bayesian Optimization and the DIRECT algorithm.
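Switching algorithms is largely a matter of choosing a different optimizer class over the same search-space interface. The sketch below uses one of the new population-based optimizers on a non-convex test function; the class name DifferentialEvolutionOptimizer is inferred from the release notes and the library's naming convention, so treat it as an assumption:

import numpy as np
from gradient_free_optimizers import DifferentialEvolutionOptimizer  # assumed class name

# Non-convex Ackley test function over a 2D discrete search space.
def ackley(para):
    x, y = para["x"], para["y"]
    loss = (
        -20 * np.exp(-0.2 * np.sqrt(0.5 * (x**2 + y**2)))
        - np.exp(0.5 * (np.cos(2 * np.pi * x) + np.cos(2 * np.pi * y)))
        + np.e
        + 20
    )
    return -loss  # maximize the negative of the loss

search_space = {
    "x": np.arange(-5, 5, 0.01),
    "y": np.arange(-5, 5, 0.01),
}

opt = DifferentialEvolutionOptimizer(search_space)
opt.search(ackley, n_iter=5000)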

Installation of the package is straightforward via pip, using the command "pip install gradient-free-optimizers." Additionally, the repository provides examples demonstrating the optimization of both convex and non-convex functions, as well as the optimization of machine learning hyperparameters. For further details, users can access the official documentation linked within the repository.
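For the hyperparameter-tuning use case, the objective function can simply wrap a cross-validation score. A hedged sketch using scikit-learn; the model and parameter ranges are illustrative rather than taken from the repository's examples:

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from gradient_free_optimizers import BayesianOptimizer

X, y = load_iris(return_X_y=True)

# Score a hyperparameter candidate by mean cross-validated accuracy.
def model(para):
    clf = DecisionTreeClassifier(
        max_depth=int(para["max_depth"]),
        min_samples_split=int(para["min_samples_split"]),
    )
    return cross_val_score(clf, X, y, cv=5).mean()

search_space = {
    "max_depth": np.arange(1, 25),
    "min_samples_split": np.arange(2, 20),
}

opt = BayesianOptimizer(search_space)
opt.search(model, n_iter=50)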

7 comments
By @Matumio - 6 months
When I read "evolution strategy" I was pretty sure to find some variant of the Canonical Evolution Strategy (as in https://arxiv.org/pdf/1802.08842) or maybe CMA-ES or something related. But the implementation looks like a GA. Maybe the term means different things to different people...?
By @spencerchubb - 6 months
The title mentions two new evolutionary algorithms, and I think it would be good if the title clarified which two are new. It seems the new ones are the genetic algorithm and differential evolution.

I find it interesting that Gradient-Free-Optimizers is used in a library for hyperparameter optimization, so in essence a gradient-free approach is being used to optimize a gradient-based approach.

By @528491 - 6 months
The new release adds the Genetic Algorithm and Differential Evolution. Also check out the documentation for the new optimization algorithms: https://simonblanke.github.io/gradient-free-optimizers-docum...
By @ris - 6 months
I've previously used pagmo2 for this kind of thing with some amount of success. Might be worth giving this one a try, as pagmo2's C++ patterns can be something of a mindfuck.
By @jglamine - 6 months
Are there any advantages of this over scipy.optimize? That's what I've used in the past. Trying to understand if it's worth switching.

It looks like they have many of the same algorithms.