* [[dlib]]<ref name=dlib_github>{{Cite web|url=https://github.com/davisking/dlib|title=A toolkit for making real world machine learning and data analysis applications in C++: davisking/dlib|date=February 25, 2019|via=GitHub}}</ref> is a C++ package with a Python API that provides a parameter-free optimizer based on [https://arxiv.org/abs/1703.02628 LIPO] and [[trust region]] optimizers working in tandem.<ref name=dlib_blog>{{cite web |last1=King |first1=Davis |title=A Global Optimization Algorithm Worth Using |url=http://blog.dlib.net/2017/12/a-global-optimization-algorithm-worth.html}}</ref> A usage sketch follows the list.
* [https://github.com/callowbird/Harmonica Harmonica] is a Python package for spectral hyperparameter optimization.<ref name=abs1706.00764/>
* [https://github.com/hyperopt/hyperopt hyperopt], also available via [https://github.com/maxpumperla/hyperas hyperas] and [https://github.com/hyperopt/hyperopt-sklearn hyperopt-sklearn], is a Python package for distributed hyperparameter optimization based on the [[kernel density estimation|tree of Parzen estimators]]. A usage sketch follows the list.
* [https://github.com/kubeflow/katib/ Katib] is a Kubernetes-native system which includes grid search, random search, Bayesian optimization, Hyperband, and NAS based on reinforcement learning.
* [https://github.com/facebookresearch/nevergrad nevergrad]<ref name=nevergrad_issue1>{{Cite web|url=https://github.com/facebookresearch/nevergrad/issues/1|title=[QUESTION] How to use to optimize NN hyperparameters · Issue #1 · facebookresearch/nevergrad|website=GitHub}}</ref> is a Python package for gradient-free optimization using techniques such as differential evolution, sequential quadratic programming, fastGA, covariance matrix adaptation, population control methods, and particle swarm optimization.<ref name=nevergrad>{{Cite web|url=https://code.fb.com/ai-research/nevergrad/|title=Nevergrad: An open source tool for derivative-free optimization|date=December 20, 2018}}</ref> A usage sketch follows the list.
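A minimal sketch of dlib's parameter-free optimizer through its Python binding <code>find_min_global</code>, as described in the blog post cited above. The two-argument objective function, its bounds, and the call budget of 80 are hypothetical placeholders; only box bounds and an evaluation budget are required, since LIPO and trust-region refinement need no tuning parameters of their own.

<syntaxhighlight lang="python">
import dlib

# Hypothetical objective: stands in for any deterministic function
# of the hyperparameters, e.g. validation loss of a trained model.
def objective(learning_rate, regularization):
    return (learning_rate - 0.03) ** 2 + (regularization - 0.1) ** 2

# find_min_global takes the function, per-argument lower and upper
# bounds, and the total number of objective evaluations.
best_params, best_score = dlib.find_min_global(
    objective,
    [1e-5, 0.0],   # lower bounds: learning_rate, regularization
    [1e-1, 1.0],   # upper bounds: learning_rate, regularization
    80)            # evaluation budget (arbitrary choice here)

print(best_params, best_score)
</syntaxhighlight>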
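A minimal sketch of tree-of-Parzen-estimators search with hyperopt's <code>fmin</code>, following the pattern in the package's documentation. The quadratic objective, the single-parameter search space, and the budget of 100 evaluations are illustrative placeholders.

<syntaxhighlight lang="python">
from hyperopt import fmin, tpe, hp

# Hypothetical objective: replace with a real validation metric.
def objective(x):
    return (x - 0.3) ** 2

best = fmin(
    fn=objective,
    space=hp.uniform('x', -1.0, 1.0),  # search space for one hyperparameter
    algo=tpe.suggest,                  # tree of Parzen estimators
    max_evals=100)                     # evaluation budget

print(best)  # a dict mapping parameter names to best values found
</syntaxhighlight>

Swapping <code>tpe.suggest</code> for <code>hyperopt.rand.suggest</code> falls back to random search over the same space, which is how the package separates the search space from the search algorithm.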
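A minimal sketch of gradient-free optimization with nevergrad, following the pattern in its documentation and assuming a recent version of the library. The objective is a placeholder, and <code>NGOpt</code> is only one of the many optimizers the package exposes, which cover the techniques listed above.

<syntaxhighlight lang="python">
import nevergrad as ng

# Hypothetical objective over two continuous hyperparameters,
# received as a NumPy array of length 2.
def objective(x):
    return sum((x - 0.5) ** 2)

# parametrization=2 requests a 2-dimensional real-valued input;
# budget is the total number of objective evaluations allowed.
optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(objective)

print(recommendation.value)  # best input found, as a NumPy array
</syntaxhighlight>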