Hyperparameter optimization
=== Grid search ===
The traditional way of performing hyperparameter optimization has been ''grid search'', or a ''parameter sweep'', which is simply an [[Brute-force search|exhaustive search]] through a manually specified subset of the hyperparameter space of a learning algorithm. A grid search algorithm must be guided by some performance metric, typically measured by [[Cross-validation (statistics)|cross-validation]] on the training set<ref>Chin-Wei Hsu, Chih-Chung Chang and Chih-Jen Lin (2010). [http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf A practical guide to support vector classification]. Technical Report, [[National Taiwan University]].</ref>
or evaluation on a held-out validation set.<ref>{{cite journal
| vauthors = Chicco D
| title = Ten quick tips for machine learning in computational biology
| journal = BioData Mining
| volume = 10
| issue = 35
| pages = 1–17
| date = December 2017
| pmid = 29234465
| doi = 10.1186/s13040-017-0155-3
| pmc= 5721660}}</ref>
 
 
Since the parameter space of a machine learner may include real-valued or unbounded value spaces for certain parameters, manually setting bounds and discretizing the space may be necessary before applying grid search.
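The procedure above — discretize each hyperparameter, then exhaustively score every combination — can be sketched in a few lines of Python. This is a minimal, self-contained illustration, not any particular library's implementation: the grid values and the scoring function (a stand-in for real cross-validation of a learner) are hypothetical.

```python
from itertools import product

# Hypothetical discretized grid for two SVM-style hyperparameters.
param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": [0.001, 0.01, 0.1],
}

def cross_val_score(params):
    # Stand-in for k-fold cross-validation of a real learner;
    # this toy score peaks at C=10, gamma=0.01.
    return -abs(params["C"] - 10) - 100 * abs(params["gamma"] - 0.01)

def grid_search(param_grid, score_fn):
    """Exhaustively evaluate every combination in the grid
    and return the best-scoring hyperparameter setting."""
    names = list(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, _ = grid_search(param_grid, cross_val_score)
```

Note that the number of evaluations is the product of the grid sizes (here 4 × 3 = 12), which is why grid search scales poorly as the number of hyperparameters grows.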