We describe the optimization algorithm implemented in the open-source derivative-free solver RBFOpt. The algorithm is based on the radial basis function method of Gutmann and the metric stochastic response surface method of Regis and Shoemaker. We propose several modifications aimed at generalizing and improving these two algorithms: (i) the use of an extended space to represent categorical variables in unary encoding; (ii) a refinement phase to locally improve a candidate solution; (iii) interpolation models without the unisolvence condition, both to help deal with categorical variables and to initiate the optimization before a uniquely determined model is possible; (iv) a master-worker framework to allow asynchronous objective function evaluations in parallel. Numerical experiments show the effectiveness of these ideas.
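Idea (i) above, representing a categorical variable in unary (one-hot) encoding within an extended space, can be sketched as follows. The helper names below are illustrative, not RBFOpt's actual API; the point of the encoding is that, in the extended space, any two distinct categories are at the same Euclidean distance from each other, so the radial basis function surrogate imposes no artificial ordering on them.

```python
# Sketch of unary encoding for categorical variables (assumption: names
# like unary_encode / extended_point are illustrative, not RBFOpt's API).

def unary_encode(value, categories):
    """Map a categorical value to a 0/1 vector with a single 1."""
    return [1.0 if c == value else 0.0 for c in categories]

def extended_point(continuous, categorical, categories_per_var):
    """Concatenate continuous coordinates with the unary encoding of
    each categorical coordinate, giving the extended-space point."""
    point = list(continuous)
    for value, cats in zip(categorical, categories_per_var):
        point.extend(unary_encode(value, cats))
    return point

def euclidean(p, q):
    """Plain Euclidean distance between two extended-space points."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

# Points differing only in the categorical coordinate are equidistant,
# no matter which pair of categories they take.
cats = [["red", "green", "blue"]]
p = extended_point([0.5], ["red"], cats)
q = extended_point([0.5], ["green"], cats)
r = extended_point([0.5], ["blue"], cats)
print(euclidean(p, q) == euclidean(p, r))  # → True
```

Note that an integer encoding (red=0, green=1, blue=2) would instead make some category pairs look "closer" than others to the interpolant, which is exactly what the extended space avoids.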
@article{OJMO_2021__2__A1_0, author = {Giacomo Nannicini}, title = {On the implementation of a global optimization method for mixed-variable problems}, journal = {Open Journal of Mathematical Optimization}, eid = {1}, pages = {1--25}, publisher = {Universit\'e de Montpellier}, volume = {2}, year = {2021}, doi = {10.5802/ojmo.3}, language = {en}, url = {https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.3/} }
[1] Optuna: A Next-generation Hyperparameter Optimization Framework, Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (2019) | DOI
[2] Pattern search algorithms for mixed variable programming, SIAM J. Optim., Volume 11 (2001) no. 3, pp. 573-594 | DOI | MR | Zbl
[3] Mesh adaptive direct search algorithms for constrained optimization, SIAM J. Optim., Volume 17 (2006) no. 1, pp. 188-217 | DOI | MR | Zbl
[4] Order-based error for managing ensembles of surrogates in mesh adaptive direct search, Journal of Global Optimization, Volume 70 (2018) no. 3, pp. 645-675 | DOI | MR | Zbl
[5] An algorithmic framework for convex Mixed Integer Nonlinear Programs, Discrete Optimization, Volume 5 (2008) no. 2, pp. 186-204 | DOI | MR | Zbl
[6] Trust region methods, Society for Industrial and Applied Mathematics, 2000 | Zbl
[7] Geometry of interpolation sets in derivative free optimization, Math. Program., Volume 111 (2008) no. 1-2, pp. 141-172 | DOI | MR
[8] Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points, SIAM J. Optim., Volume 20 (2009) no. 1, pp. 387-415 | DOI | MR | Zbl
[9] Efficient parameter estimation for information retrieval using black-box optimization, IEEE Transactions on Knowledge and Data Engineering, Volume 30 (2018) no. 7, pp. 1240-1253 | DOI
[10] RBFOpt: an open-source library for black-box optimization with costly function evaluations, Mathematical Programming Computation, Volume 10 (2018) no. 4, pp. 597-629 | DOI | MR | Zbl
[11] An effective algorithm for hyperparameter optimization of neural networks, IBM Journal of Research and Development, Volume 61 (2017) no. 4/5
[12] The global optimization problem: an introduction, Towards Global Optimization (L. C. W. Dixon; G. P. Szego, eds.), North-Holland, 1975, pp. 1-15 | Zbl
[13] Radial basis function interpolation, Ph. D. Thesis, Stellenbosch University (2008)
[14] Surrogate Optimization Toolbox (pySOT), 2015 (http://github.com/dme65/pysot)
[15] Scalable global optimization via local Bayesian optimization, Advances in Neural Information Processing Systems (2019), pp. 5496-5507
[16] A Radial Basis Function Method for Global Optimization, Journal of Global Optimization, Volume 19 (2001) no. 3, pp. 201-227 | DOI | MR
[17] scikit-optimize/scikit-optimize: v0.5.2, 2018 (Zenodo, https://www.doi.org/10.5281/zenodo.1207017) | DOI
[18] Sequential model-based optimization for general algorithm configuration, International Conference on Learning and Intelligent Optimization (2011), pp. 507-523 | DOI
[19] HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search (2019) (https://arxiv.org/abs/1907.01698)
[20] Algorithm 909: NOMAD: Nonlinear Optimization with the MADS algorithm, ACM Trans. Math. Softw., Volume 37 (2011) no. 4, p. 44:1-44:15 | DOI | MR | Zbl
[21] An algorithmic framework based on primitive directions and nonmonotone line searches for black-box optimization problems with integer variables, Mathematical Programming Computation, Volume 12 (2020) no. 4, pp. 673-702 | DOI | MR | Zbl
[22] MINLP Library 2 (http://www.gamsworld.org/minlp/minlplib2/html/)
[23] Benchmarking Derivative-Free Optimization Algorithms, SIAM J. Optim., Volume 20 (2009) no. 1, pp. 172-191 | DOI | MR | Zbl
[24] MISO: mixed-integer surrogate optimization framework, Optimization and Engineering (2015), pp. 1-27 (Online first) | DOI | Zbl
[25] SO-MI: A surrogate model algorithm for computationally expensive nonlinear mixed-integer black-box global optimization problems, Computers & Operations Research, Volume 40 (2013) no. 5, pp. 1383-1400 | DOI | MR | Zbl
[26] Neumaier’s collection of test problems for global optimization (http://www.mat.univie.ac.at/~neum/glopt/my_problems.html, retrieved in May 2014)
[27] Combinatorial Optimization: Algorithms and Complexity, Dover Publications, 1998 | Zbl
[28] Scikit-learn: Machine Learning in Python, J. Mach. Learn. Res., Volume 12 (2011), pp. 2825-2830 | MR | Zbl
[29] Five lectures on radial basis functions, Informatics and Mathematical Modelling, Technical University of Denmark, DTU (2005)
[30] Nevergrad - A gradient-free optimization platform, https://GitHub.com/FacebookResearch/Nevergrad, 2018
[31] An initialization strategy for high-dimensional surrogate-based expensive black-box optimization, Modeling and Optimization: Theory and Applications, Springer, 2013, pp. 51-85 | MR | Zbl
[32] Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points, Engineering Optimization, Volume 46 (2014) no. 2, pp. 218-243 | DOI | MR
[33] A Stochastic Radial Basis Function Method for the Global Optimization of Expensive Functions, INFORMS Journal on Computing, Volume 19 (2007) no. 4, pp. 497-509 | DOI | MR | Zbl
[34] Large-scale Constrained Black-box Optimization: Theory, Methodology, and Applications, Ph. D. Thesis, Singapore University of Technology and Design (2017)
[35] Some experiments in machine learning using vector evaluated genetic algorithms, Ph. D. Thesis, Vanderbilt University (1984)
[36] A wide class of test functions for global optimization, Journal of Global Optimization, Volume 3 (1993) no. 2, pp. 133-137 | DOI | MR | Zbl
[37] Practical Bayesian optimization of machine learning algorithms, Advances in Neural Information Processing Systems, Volume 25 (2012), pp. 2951-2959
[38] On the Implementation of a Primal-Dual Interior Point Filter Line Search Algorithm for Large-Scale Nonlinear Programming, Math. Program., Volume 106 (2006) no. 1, pp. 25-57 | DOI | Zbl
[39] Global Convergence of Radial Basis Function Trust-Region Algorithms for Derivative-Free Optimization, SIAM Rev., Volume 55 (2013) no. 2, pp. 349-371 | DOI | MR | Zbl