On the implementation of a global optimization method for mixed-variable problems
Open Journal of Mathematical Optimization, Volume 2 (2021), article no. 1, 25 p.

We describe the optimization algorithm implemented in the open-source derivative-free solver RBFOpt. The algorithm is based on the radial basis function method of Gutmann and the metric stochastic response surface method of Regis and Shoemaker. We propose several modifications aimed at generalizing and improving these two algorithms: (i) the use of an extended space to represent categorical variables in unary encoding; (ii) a refinement phase to locally improve a candidate solution; (iii) interpolation models without the unisolvence condition, both to help deal with categorical variables and to initiate the optimization before a uniquely determined model is possible; (iv) a master-worker framework to allow asynchronous objective function evaluations in parallel. Numerical experiments show the effectiveness of these ideas.
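Idea (i) above, representing each categorical variable in unary (one-hot) encoding inside an extended space, can be illustrated with a minimal sketch. This is an assumption-laden illustration of the general technique, not RBFOpt's actual implementation; the function names and data layout are hypothetical.

```python
# Sketch of unary (one-hot) encoding of categorical variables into an
# extended space, as in idea (i). Hypothetical helpers, not RBFOpt code.

def encode_point(point, categorical_info):
    """Map a mixed-variable point to the extended space.

    point: list of variable values; a categorical variable stores its
        level index.
    categorical_info: dict {position: number_of_levels}. A categorical
        variable with k levels becomes a 0/1 block of length k with a
        single 1 marking the chosen level.
    """
    encoded = []
    for i, value in enumerate(point):
        if i in categorical_info:
            k = categorical_info[i]
            block = [0.0] * k
            block[int(value)] = 1.0
            encoded.extend(block)
        else:
            encoded.append(float(value))
    return encoded


def decode_point(encoded, n_vars, categorical_info):
    """Inverse map: in each unary block, pick the level whose
    coordinate is largest (useful when a model suggests a fractional
    point in the extended space)."""
    decoded, j = [], 0
    for i in range(n_vars):
        if i in categorical_info:
            k = categorical_info[i]
            block = encoded[j:j + k]
            decoded.append(block.index(max(block)))
            j += k
        else:
            decoded.append(encoded[j])
            j += 1
    return decoded
```

For example, a point with one continuous variable and one categorical variable with three levels, `[2.5, 1]`, maps to `[2.5, 0.0, 1.0, 0.0]` in the extended space. One consequence, noted in the abstract, is that interpolation points in the extended space need not satisfy the unisolvence condition, motivating idea (iii).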

DOI: https://doi.org/10.5802/ojmo.3
Keywords: Derivative-free optimization, black-box optimization, mixed-variable problems
@article{OJMO_2021__2__A1_0,
     author = {Giacomo Nannicini},
     title = {On the implementation of a global optimization method for mixed-variable problems},
     journal = {Open Journal of Mathematical Optimization},
     eid = {1},
     publisher = {Universit\'e de Montpellier},
     volume = {2},
     year = {2021},
     doi = {10.5802/ojmo.3},
     language = {en},
     url = {https://ojmo.centre-mersenne.org/item/OJMO_2021__2__A1_0/}
}
Giacomo Nannicini. On the implementation of a global optimization method for mixed-variable problems. Open Journal of Mathematical Optimization, Volume 2 (2021), article no. 1, 25 p. doi: 10.5802/ojmo.3. https://ojmo.centre-mersenne.org/item/OJMO_2021__2__A1_0/

[1] Takuya Akiba; Shotaro Sano; Toshihiko Yanase; Takeru Ohta; Masanori Koyama Optuna: A Next-generation Hyperparameter Optimization Framework, Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (2019)

[2] Charles Audet; John E. Dennis Jr Pattern search algorithms for mixed variable programming, SIAM J. Optim., Volume 11 (2001) no. 3, pp. 573-594 | Zbl 1035.90048

[3] Charles Audet; John E. Dennis Jr. Mesh adaptive direct search algorithms for constrained optimization, SIAM J. Optim., Volume 17 (2006) no. 1, pp. 188-217

[4] Charles Audet; Michael Kokkolaras; Sébastien Le Digabel; Bastien Talgorn Order-based error for managing ensembles of surrogates in mesh adaptive direct search, Journal of Global Optimization, Volume 70 (2018) no. 3, pp. 645-675

[5] P. Bonami; L. T. Biegler; A. R. Conn; G. Cornuéjols; I. E. Grossmann; C. D. Laird; J. Lee; A. Lodi; F. Margot; N. Sawaya; A. Wächter An algorithmic framework for convex Mixed Integer Nonlinear Programs, Discrete Optimization, Volume 5 (2008) no. 2, pp. 186-204

[6] Andrew R. Conn; Nicholas I. M. Gould; Philippe L. Toint Trust region methods, Society for Industrial and Applied Mathematics, 2000

[7] Andrew R. Conn; Katya Scheinberg; Luís N. Vicente Geometry of interpolation sets in derivative free optimization, Math. Program., Volume 111 (2008) no. 1-2, pp. 141-172

[8] Andrew R. Conn; Katya Scheinberg; Luís N. Vicente Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points, SIAM J. Optim., Volume 20 (2009) no. 1, pp. 387-415

[9] Alberto Costa; Emanuele Di Buccio; Massimo Melucci; Giacomo Nannicini Efficient parameter estimation for information retrieval using black-box optimization, IEEE Transactions on Knowledge and Data Engineering, Volume 30 (2018) no. 7, pp. 1240-1253

[10] Alberto Costa; Giacomo Nannicini RBFOpt: an open-source library for black-box optimization with costly function evaluations, Mathematical Programming Computation, Volume 10 (2018) no. 4, pp. 597-629 | Article

[11] Gonzalo I. Diaz; Achille Fokoue; Giacomo Nannicini; Horst Samulowitz An effective algorithm for hyperparameter optimization of neural networks, IBM Journal of Research and Development, Volume 61 (2017) no. 4/5

[12] L. C. W. Dixon; G. P. Szego The global optimization problem: an introduction, Towards Global Optimization (L. C. W. Dixon; G. P. Szego, eds.), North-Holland, 1975, pp. 1-15

[13] Wilna Du Toit Radial basis function interpolation (2008) (Ph. D. Thesis)

[14] David Eriksson; David Bindel; Christine A. Shoemaker Surrogate Optimization Toolbox (pySOT), 2015 (http://github.com/dme65/pysot)

[15] David Eriksson; Michael Pearce; Jacob Gardner; Ryan D. Turner; Matthias Poloczek Scalable global optimization via local Bayesian optimization, Advances in Neural Information Processing Systems (2019), pp. 5496-5507

[16] Hans-Martin Gutmann A Radial Basis Function Method for Global Optimization, Journal of Global Optimization, Volume 19 (2001) no. 3, pp. 201-227 | Article

[17] Tim Head; MechCoder; Gilles Louppe; Iaroslav Shcherbatyi; fcharras; Zé Vinícius; cmmalone; Christopher Schröder; nel215; Nuno Campos; Todd Young; Stefano Cereda; Thomas Fan; rene rex; Kejia (KJ) Shi; Justus Schwabedal; carlosdanielcsantos; Hvass-Labs; Mikhail Pak; SoManyUsernamesTaken; Fred Callaway; Loïc Estève; Lilian Besson; Mehdi Cherti; Karlson Pfannschmidt; Fabian Linzberger; Christophe Cauet; Anna Gut; Andreas Mueller; Alexander Fabisch scikit-optimize/scikit-optimize: v0.5.2, 2018 (Zenodo, https://doi.org/10.5281/zenodo.1207017) | Article

[18] Frank Hutter; Holger H. Hoos; Kevin Leyton-Brown Sequential model-based optimization for general algorithm configuration, International Conference on Learning and Intelligent Optimization (2011), pp. 507-523

[19] Dounia Lakhmiri; Sébastien Le Digabel; Christophe Tribes HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search (2019) (https://arxiv.org/abs/1907.01698)

[20] S. Le Digabel Algorithm 909: NOMAD: Nonlinear Optimization with the MADS algorithm, ACM Trans. Math. Softw., Volume 37 (2011) no. 4, p. 44:1-44:15 | Article

[21] Giampaolo Liuzzi; Stefano Lucidi; Francesco Rinaldi An algorithmic framework based on primitive directions and nonmonotone line searches for black-box optimization problems with integer variables, Mathematical Programming Computation, Volume 12 (2020) no. 4, pp. 673-702

[22] MINLP Library 2 (http://www.gamsworld.org/minlp/minlplib2/html/)

[23] Jorge J. Moré; Stefan M. Wild Benchmarking Derivative-Free Optimization Algorithms, SIAM J. Optim., Volume 20 (2009) no. 1, pp. 172-191

[24] Juliane Müller MISO: mixed-integer surrogate optimization framework, Optimization and Engineering (2015), pp. 1-27 (Online first) | Article

[25] Juliane Müller; Christine A. Shoemaker; Robert Piché SO-MI: A surrogate model algorithm for computationally expensive nonlinear mixed-integer black-box global optimization problems, Computers & Operations Research, Volume 40 (2013) no. 5, pp. 1383-1400 | Article

[26] Arnold Neumaier Neumaier’s collection of test problems for global optimization (http://www.mat.univie.ac.at/~neum/glopt/my_problems.html, retrieved in May 2014)

[27] C. H. Papadimitriou; K. Steiglitz Combinatorial Optimization: Algorithms and Complexity, Dover Publications, 1998

[28] F. Pedregosa; G. Varoquaux; A. Gramfort; V. Michel; B. Thirion; O. Grisel; M. Blondel; P. Prettenhofer; R. Weiss; V. Dubourg; J. Vanderplas; A. Passos; D. Cournapeau; M. Brucher; M. Perrot; E. Duchesnay Scikit-learn: Machine Learning in Python, J. Mach. Learn. Res., Volume 12 (2011), pp. 2825-2830

[29] Mike J. D. Powell Five lectures on radial basis functions, Informatics and Mathematical Modelling, Technical University of Denmark, DTU (2005)

[30] J. Rapin; O. Teytaud Nevergrad - A gradient-free optimization platform, https://GitHub.com/FacebookResearch/Nevergrad, 2018

[31] Rommel G. Regis An initialization strategy for high-dimensional surrogate-based expensive black-box optimization, Modeling and Optimization: Theory and Applications, Springer, 2013, pp. 51-85

[32] Rommel G. Regis Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points, Engineering Optimization, Volume 46 (2014) no. 2, pp. 218-243

[33] Rommel G. Regis; Christine A. Shoemaker A Stochastic Radial Basis Function Method for the Global Optimization of Expensive Functions, INFORMS Journal on Computing, Volume 19 (2007) no. 4, pp. 497-509 | Article

[34] Giorgio Sartor Large-scale Constrained Black-box Optimization: Theory, Methodology, and Applications (2017) (Ph. D. Thesis)

[35] James David Schaffer Some experiments in machine learning using vector evaluated genetic algorithms (1984) (Ph. D. Thesis)

[36] Fabio Schoen A wide class of test functions for global optimization, Journal of Global Optimization, Volume 3 (1993) no. 2, pp. 133-137

[37] Jasper Snoek; Hugo Larochelle; Ryan P. Adams Practical Bayesian optimization of machine learning algorithms, Advances in Neural Information Processing Systems, Volume 25 (2012), pp. 2951-2959

[38] A. Wächter; L. T. Biegler On the Implementation of a Primal-Dual Interior Point Filter Line Search Algorithm for Large-Scale Nonlinear Programming, Math. Program., Volume 106 (2006) no. 1, pp. 25-57

[39] Stefan M. Wild; Christine A. Shoemaker Global Convergence of Radial Basis Function Trust-Region Algorithms for Derivative-Free Optimization, SIAM Rev., Volume 55 (2013) no. 2, pp. 349-371