humpday
Derivative-free optimizers from many packages in a common syntax, with evaluation docs and Elo ratings.
- There's a colab notebook that recommends a black-box derivative-free optimizer for your objective function.
- About fifty strategies drawn from various open source packages are assigned Elo ratings, depending on the dimension of the problem and the number of function evaluations allowed.
Hello and welcome to HumpDay, a package that helps you choose a Python global optimizer package, and a strategy therein, from Ax-Platform, bayesian-optimization, DLib, HyperOpt, NeverGrad, Optuna, Platypus, PyMoo, PySOT, SciPy (classic and SHGO), Skopt, nlopt, Py-BOBYQA, UltraOpt and maybe others by the time you read this. It also presents some of their functionality in a common calling syntax.
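As a rough illustration of that common syntax (a sketch under assumptions, not the definitive API: the optimizer name below is hypothetical, and the convention that objectives live on the unit hypercube should be checked against the docs for your installed version):

# Sketch of the common calling convention. Objectives are plain callables that
# take a single vector u (entries assumed to lie in [0, 1]) and return a scalar
# to be minimized. "some_optimizer_cube" is a hypothetical stand-in for any of
# the wrapped strategies shipped with humpday.
import math

def squiggly(u):
    return u[0] * math.sin(10 * u[1]) + (u[2] - 0.5) ** 2

# best_val, best_x = some_optimizer_cube(squiggly, n_dim=3, n_trials=80)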
Cite or be cited
Pull requests against CITE.md are welcome. If your package is benchmarked here, I'd like to get this bit right.
Install
See INSTALL.md
Short version:
pip install humpday
pip install humpday[full]
Recommendations
Pass the dimension of the problem, the function evaluation budget and the time budget to receive suggestions that don't depend on the specifics of your objective function:
from pprint import pprint
from humpday import suggest
pprint(suggest(n_dim=5, n_trials=130, n_seconds=5*60))
where n_seconds is the total computation budget for the optimizer (not the objective function) over all 130 function evaluations. Or simply pass your objective function, and it will time it and do something sensible:
import math
import time
from humpday import recommend

def my_objective(u):
    time.sleep(0.01)  # pretend the objective is somewhat expensive
    return u[0] * math.sin(u[1])

recommendations = recommend(my_objective, n_dim=21, n_trials=130)
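The structure of the returned recommendations can differ between humpday versions, so the snippet below is only a sketch: it assumes each entry is, or contains, an optimizer callable following the common syntax, which is worth verifying before relying on it.

from pprint import pprint

# Inspect before assuming a fixed structure.
pprint(recommendations)

# Hypothetical usage, assuming the first entry is a callable with the common
# signature; check your installed humpday version before uncommenting.
# top = recommendations[0]
# best_val, best_x = top(my_objective, n_dim=21, n_trials=130)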
Points race
If you have more time, call points_race on a list of your own objective functions:
from humpday import points_race
points_race(objectives=[my_objective]*2, n_dim=5, n_trials=100)
See the colab notebook.
How it works
In the background, 50+ strategies are assigned Elo ratings by sister repo optimizer-elo-ratings. Oh I said that already. Never mind.
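For intuition, the rating scheme works the way Elo works in chess: two strategies are run head-to-head on the same objective with the same evaluation budget, the one reaching the lower minimum wins the match, and both ratings shift according to how surprising that outcome was given the current ratings. A minimal sketch of the standard Elo update (the real bookkeeping, including dimension- and budget-specific leaderboards, lives in optimizer-elo-ratings):

def elo_update(rating_a, rating_b, a_won, k=25):
    # Expected score of A under the logistic Elo model.
    expected_a = 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))
    score_a = 1.0 if a_won else 0.0
    delta = k * (score_a - expected_a)
    return rating_a + delta, rating_b - delta

# Example: an 1800-rated strategy beats a 1600-rated one on some objective.
new_a, new_b = elo_update(1800, 1600, a_won=True)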
Contribute
By all means contribute more optimizers.