Description
BayesOpt is an efficient implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, stochastic bandits and hyperparameter tuning. In the literature it is also called Sequential Kriging Optimization (SKO), Sequential Model-Based Optimization (SMBO) or Efficient Global Optimization (EGO).
Bayesian optimization uses a distribution over functions to build a model of the unknown function whose extrema we are looking for, and then applies an active learning strategy to select the query points with the greatest potential interest or improvement. Thus, it is a sample-efficient method for nonlinear optimization, design of experiments or bandit-like problems.
BayesOpt alternatives and similar libraries
Eclipse Deeplearning4J
Suite of tools for deploying and training deep learning models on the JVM. Highlights include model import for Keras, TensorFlow and ONNX/PyTorch, a modular and tiny C++ library for running math code, and a Java-based math library on top of the core C++ library. Also includes SameDiff: a PyTorch/TensorFlow-like library for running deep learning using automatic differentiation.
Modern C++ framework for Symbolic Regression
Modern C++ framework for symbolic regression that uses genetic programming to explore a hypothesis space of possible mathematical expressions.
Evolving Objects
A template-based, ANSI-C++ evolutionary computation library which helps you to write your own stochastic optimization algorithms insanely fast. [LGPL]
README
BayesOpt: A Bayesian optimization library
BayesOpt is an efficient implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and hyperparameter tuning.
Bayesian optimization uses a distribution over functions to build a surrogate model of the unknown function whose optimum we are looking for, and then applies an active learning strategy to select the query points with the greatest potential interest or improvement. Thus, it is a sample-efficient method for nonlinear optimization, design of experiments and simulations, or bandit-like problems. Currently, it is used in many scientific and industrial applications. In the literature it is also called Sequential Kriging Optimization (SKO), Sequential Model-Based Optimization (SMBO) or Efficient Global Optimization (EGO).
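The surrogate-plus-acquisition loop described above can be sketched in a few dozen lines. The following illustrative, dependency-free Python example (not BayesOpt's own API; all names are hypothetical) uses a Gaussian-process surrogate with a squared-exponential kernel and the expected-improvement criterion to minimize a 1-D function:

```python
import math

def kernel(a, b, ls=0.3):
    # Squared-exponential (RBF) covariance between two scalar inputs
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting: returns x with A x = b
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(X, y, xq, noise=1e-6):
    # Zero-mean GP posterior mean and variance at query point xq
    n = len(X)
    K = [[kernel(X[i], X[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    k = [kernel(X[i], xq) for i in range(n)]
    alpha = solve(K, y)
    mean = sum(k[i] * alpha[i] for i in range(n))
    v = solve(K, k)
    var = max(kernel(xq, xq) - sum(k[i] * v[i] for i in range(n)), 1e-12)
    return mean, var

def expected_improvement(mean, var, best):
    # EI for minimization: (best - mean) * Phi(z) + s * phi(z)
    s = math.sqrt(var)
    z = (best - mean) / s
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (best - mean) * Phi + s * phi

def bayes_opt(f, lb, ub, n_init=3, n_iter=10):
    # Initial design: evenly spaced samples, then sequential EI queries
    X = [lb + (ub - lb) * i / (n_init - 1) for i in range(n_init)]
    y = [f(x) for x in X]
    grid = [lb + (ub - lb) * i / 200 for i in range(201)]
    for _ in range(n_iter):
        best = min(y)
        xq = max(grid, key=lambda x: expected_improvement(*gp_posterior(X, y, x), best))
        X.append(xq)
        y.append(f(xq))
    i = min(range(len(y)), key=lambda i: y[i])
    return X[i], y[i]
```

For example, `bayes_opt(lambda x: (x - 0.3) ** 2, 0.0, 1.0)` homes in on the minimizer near 0.3 with only a handful of function evaluations; BayesOpt itself implements the same idea with many more kernels, acquisition criteria and learning strategies.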
BayesOpt is licensed under the AGPL and it is free to use. However, if you use BayesOpt in a work that leads to a scientific publication, we would appreciate it if you would kindly cite BayesOpt in your manuscript.
Ruben Martinez-Cantin, BayesOpt: A Bayesian Optimization Library for Nonlinear Optimization, Experimental Design and Bandits. Journal of Machine Learning Research, 15(Nov):3735--3739, 2014.
The paper can be found at http://jmlr.org/papers/v15/martinezcantin14a.html
Commercial applications may also acquire a commercial license. Please contact [email protected] for details.
Getting and installing BayesOpt
The library can be downloaded from GitHub: https://github.com/rmcantin/bayesopt
You can also get the cutting-edge version from the repositories:
>> git clone https://github.com/rmcantin/bayesopt
The online documentation, which includes an installation guide, can be found at: http://rmcantin.github.io/bayesopt/html/
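After cloning, a typical CMake-based build and install might look like the following (default options assumed; consult the installation guide above for required dependencies and platform-specific flags):
>> cd bayesopt
>> cmake .
>> make
>> sudo make install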
Questions and issues
- The best place to ask questions and discuss BayesOpt is the bayesopt-discussion mailing list.
- Please file bug reports or suggestions at: https://bitbucket.org/rmcantin/bayesopt/issues or https://github.com/rmcantin/bayesopt/issues
- Alternatively, you may directly contact Ruben Martinez-Cantin [email protected].
Copyright (C) 2011-2020 Ruben Martinez-Cantin [email protected]
BayesOpt is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, version 3 of the License.
BayesOpt is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License along with BayesOpt. If not, see http://www.gnu.org/licenses/.
*Note that all licence references and agreements mentioned in the BayesOpt README section above
are relevant to that project's source code only.