Hyperparameter optimization: Difference between revisions

Fix Google Vizier reference.
Specify Google Cloud Vizier Tuning Service.
Line 208:
* [https://aws.amazon.com/sagemaker/ Amazon SageMaker] uses Gaussian processes to tune hyperparameters (see the sketch after this list).
* [https://bigml.com/api/optimls BigML OptiML] supports mixed search domains.
* [https://cloud.google.com/ml-engine/docs/tensorflow/using-hyperparameter-tuning Google Cloud Vertex Vizier] supports mixed search domains, multiobjective, multifidelity, and safety constraints.
* [https://indiesolver.com Indie Solver] supports multiobjective, multifidelity, and constraint optimization.
* [https://mindfoundry.ai/OPTaaS Mind Foundry OPTaaS] supports mixed search domains, multiobjective, constraints, parallel optimization and surrogate models.
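
As a concrete illustration of the Gaussian-process approach mentioned for Amazon SageMaker above, the following is a minimal sketch of Bayesian optimization with a Gaussian-process surrogate and an expected-improvement acquisition function, built on scikit-learn and SciPy. It does not use the SageMaker API or any vendor service; the objective function, search bounds, and evaluation budget are illustrative placeholders.

<syntaxhighlight lang="python">
# Minimal sketch of Gaussian-process-based hyperparameter tuning.
# The objective is a synthetic stand-in for a real training/validation run.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
bounds = (-6.0, 0.0)  # search over log10(learning rate)

def objective(log_lr):
    # Placeholder validation score that peaks near log_lr = -3 (lr ~ 1e-3),
    # with a little noise to mimic stochastic training.
    return -(log_lr + 3.0) ** 2 + rng.normal(scale=0.05)

# Start with a few random evaluations of the objective.
X = rng.uniform(*bounds, size=(3, 1))
y = np.array([objective(x[0]) for x in X])

for _ in range(15):
    # Fit a Gaussian-process surrogate to the observations so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)

    # Score a dense grid of candidates by expected improvement.
    candidates = np.linspace(*bounds, 200).reshape(-1, 1)
    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

    # Evaluate the most promising candidate and add it to the history.
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best log10(learning rate):", X[np.argmax(y)][0])
</syntaxhighlight>

The surrogate model and acquisition function can be swapped freely; commercial services such as those listed above differ mainly in which surrogates, acquisition strategies, and constraint types they support.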