# arxivst · stuff from arxiv that you should probably bookmark

## Bayesian optimization with virtual derivative sign observations

Abstract · Apr 4, 2017 11:40

stat-ml stat-co stat-me

### arXiv Abstract

• Eero Siivola
• Aki Vehtari
• Jarno Vanhatalo
• Javier González

Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of expensive black-box functions $g$, typically defined on a continuous subset of $\mathbb{R}^d$. Using a Gaussian process (GP) as a surrogate model for the objective and an acquisition function to systematically search its domain, BO strategies aim to minimize the number of samples required to find the minimum of $g$. Although currently available acquisition functions address this goal with different degrees of success, an over-exploration effect on the contour of $g$ is typically observed. This is due to the myopic nature of most acquisitions, which greedily try to over-reduce uncertainty at the border of the search domain. In most real problems, however, such as the configuration of machine learning algorithms, the function domain is conservatively large, and with high probability the global minimum is not at the boundary. We propose a method to incorporate this knowledge into the search process by adding virtual derivative observations at the borders of the search space. We use the properties of GP models that allow us to easily impose conditions on the partial derivatives of the objective. The method is applicable with any acquisition function, is easy to use, and consistently reduces the number of evaluations required to find the minimum of $g$ irrespective of the acquisition used. We illustrate the benefits of our approach in a simulation study with a battery of objective functions.
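The GP property the abstract leans on is that differentiation is a linear operator, so derivative values can be conditioned on jointly with function values using analytically known cross-covariances of the kernel. Below is a minimal 1-D NumPy sketch of that idea, not the paper's actual method: it replaces the paper's probit sign likelihood (handled there with approximate inference) with soft Gaussian derivative pseudo-observations at the domain borders, and all function names, kernel hyperparameters, and pseudo-observation values are illustrative assumptions.

```python
import numpy as np

def rbf(x1, x2, ell=0.3, sf2=1.0):
    """Squared-exponential kernel k(x, x') for 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return sf2 * np.exp(-0.5 * d**2 / ell**2)

def rbf_fd(x1, x2, ell=0.3, sf2=1.0):
    """Cross-covariance cov(f(x1), f'(x2)) = dk/dx'."""
    d = x1[:, None] - x2[None, :]
    return rbf(x1, x2, ell, sf2) * d / ell**2

def rbf_dd(x1, x2, ell=0.3, sf2=1.0):
    """Covariance cov(f'(x1), f'(x2)) = d^2 k / (dx dx')."""
    d = x1[:, None] - x2[None, :]
    return rbf(x1, x2, ell, sf2) * (1.0 / ell**2 - d**2 / ell**4)

def gp_posterior_mean(Xf, yf, Xd, yd, Xs, ell=0.3, sf2=1.0,
                      noise=1e-4, dnoise=1.0):
    """Posterior mean at Xs, conditioning jointly on function
    observations (Xf, yf) and derivative pseudo-observations (Xd, yd)."""
    Kff = rbf(Xf, Xf, ell, sf2) + noise * np.eye(len(Xf))
    Kfd = rbf_fd(Xf, Xd, ell, sf2)
    Kdd = rbf_dd(Xd, Xd, ell, sf2) + dnoise * np.eye(len(Xd))
    K = np.block([[Kff, Kfd], [Kfd.T, Kdd]])
    y = np.concatenate([yf, yd])
    # Cross-covariance between test points and the joint observation vector.
    Ks = np.concatenate([rbf(Xs, Xf, ell, sf2),
                         rbf_fd(Xs, Xd, ell, sf2)], axis=1)
    return Ks @ np.linalg.solve(K, y)

# Toy objective with its minimum in the interior of [0, 1].
g = lambda x: (x - 0.3) ** 2
Xf = np.array([0.2, 0.5, 0.8])   # points already evaluated
yf = g(Xf)
# Virtual observations encoding "g slopes upward toward both borders":
# a negative slope at the left edge, a positive slope at the right, with
# large noise (dnoise) standing in for the sign-only information.
Xd = np.array([0.0, 1.0])
yd = np.array([-1.0, 1.0])
mean = gp_posterior_mean(Xf, yf, Xd, yd, np.linspace(0, 1, 5))
```

Because the border pseudo-observations pull the surrogate's mean upward near the edges, acquisition functions see less promise there, which is the over-exploration fix the abstract describes.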