*Model-and-Search: a derivative-free local optimization algorithm*
*Abstract*
*In this work, we propose Model-and-Search (MAS), a novel local-search derivative-free optimization algorithm, and show that it converges to a Karush-Kuhn-Tucker point. MAS optimizes a deterministic function over a box-bounded domain and is designed to work well within a limited budget of function evaluations. In MAS, the search is oriented toward improving the value of the incumbent by combining a set of techniques, including gradient estimation and quadratic model building and optimization. We propose a novel sensitivity-based approach for constructing an incomplete quadratic model when too few points are available to build a complete quadratic surrogate of the true function. The surrogate model is then used to guide the search. We present extensive computational results on a collection of 501 publicly available test problems of varying dimension and complexity. The computational results demonstrate that MAS performs well regardless of problem convexity and smoothness.*
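The abstract does not spell out how MAS builds or optimizes its quadratic surrogate, but the general idea of fitting a quadratic model to sampled points and minimizing it over a box can be sketched as follows. This is a minimal illustration under assumed choices, not the authors' method: the minimum-norm least-squares fit (which also handles the underdetermined, incomplete-model case) stands in for their sensitivity-based construction, and a coarse grid search stands in for model optimization.

```python
import numpy as np

def quadratic_features(x):
    # Monomial basis of a full quadratic model: [1, x_i, x_i*x_j (i<=j)].
    n = len(x)
    feats = [1.0]
    feats.extend(x)
    for i in range(n):
        for j in range(i, n):
            feats.append(x[i] * x[j])
    return np.array(feats)

def fit_quadratic(points, values):
    # Least-squares fit of the model coefficients. When fewer points than
    # coefficients are available, lstsq returns the minimum-norm solution,
    # a crude stand-in for MAS's sensitivity-based incomplete model.
    A = np.vstack([quadratic_features(p) for p in points])
    coef, *_ = np.linalg.lstsq(A, np.asarray(values, dtype=float), rcond=None)
    return coef

def model_value(coef, x):
    return float(quadratic_features(x) @ coef)

# Toy usage: sample a smooth function on the box [-2, 2]^2, fit the
# surrogate, and pick the grid point minimizing the model.
rng = np.random.default_rng(0)
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2
pts = rng.uniform(-2.0, 2.0, size=(8, 2))
coef = fit_quadratic(pts, [f(p) for p in pts])
grid = np.linspace(-2.0, 2.0, 81)
best_val, best_pt = min(
    ((model_value(coef, (a, b)), (a, b)) for a in grid for b in grid)
)
```

Because the toy objective is itself quadratic, the fitted surrogate is essentially exact here and the model minimizer lands at the true minimizer (1, -0.5); on a general nonquadratic function the surrogate minimizer would only be a trial point for the next evaluation.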
Copyright of Computational Optimization & Applications is the property of Springer Nature.