
Imprecise Acquisitions in Bayesian Optimization

MCML Authors

Abstract

Gaussian processes (GPs) are widely used as surrogate models in Bayesian optimization (BO). However, their predictive performance is highly sensitive to the choice of hyperparameters, often leading to markedly different posterior predictions. Hierarchical BO addresses this issue by marginalizing over hyperparameters to produce an aggregated posterior, which is then evaluated using an acquisition function (AF). Yet, this aggregation can obscure the disagreement among individual GP posteriors, an informative source of uncertainty that could be exploited for more robust decision-making. To overcome this limitation, we propose Imprecise Acquisitions in Bayesian Optimization (IABO), which maintains a set of GP models and evaluates the AF separately under each one. This results in an imprecise, set-valued AF whose spread naturally captures model disagreement. We investigate two aggregation strategies applied at different stages: (i) acquisition-level aggregation, where AF values are combined into a single scalar via an aggregated AF, and (ii) decision-level aggregation, where each AF is optimized independently and the resulting maximizers are compared using stochastic dominance criteria. Our approach is applicable to arbitrary AFs, and experiments show that our decision-level strategies are highly competitive, often outperforming standard BO baselines across a range of benchmark problems.
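The following is a minimal sketch of the set-valued acquisition idea described in the abstract, not the authors' implementation: one acquisition function (here, Expected Improvement) is evaluated under several GP surrogates that differ only in their kernel hyperparameters, so that the spread of acquisition values reflects model disagreement. The toy objective, the fixed length-scale grid, and the mean-based aggregation are illustrative assumptions; the paper's stochastic-dominance comparison of maximizers is not reproduced.

```python
# Sketch: an imprecise (set-valued) acquisition function obtained by evaluating
# Expected Improvement under several GP hyperparameter configurations.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(mu, sigma, best_y, xi=0.01):
    """Standard EI for minimization, computed from a GP's posterior mean/std."""
    sigma = np.maximum(sigma, 1e-12)
    imp = best_y - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

# Toy observations of an unknown objective (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(8, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(8)
best_y = y.min()

# A small set of GP surrogates with different, fixed length-scales, standing in
# for the set of hyperparameter configurations maintained by the method.
length_scales = [0.3, 1.0, 3.0]
gps = [
    GaussianProcessRegressor(kernel=RBF(length_scale=l), optimizer=None).fit(X, y)
    for l in length_scales
]

# Candidate points and the set-valued acquisition: one EI curve per GP.
X_cand = np.linspace(0, 10, 200).reshape(-1, 1)
ei_set = np.vstack([
    expected_improvement(*gp.predict(X_cand, return_std=True), best_y)
    for gp in gps
])  # shape: (n_models, n_candidates)

# Acquisition-level aggregation (one possible choice): average the EI values,
# then pick the candidate maximizing the aggregated acquisition.
x_agg = X_cand[ei_set.mean(axis=0).argmax()]

# Decision-level view: optimize each EI curve separately and inspect the
# (possibly disagreeing) set of maximizers.
maximizers = X_cand[ei_set.argmax(axis=1)].ravel()
print("aggregated-acquisition choice:", x_agg.ravel())
print("per-model maximizers:", maximizers)
```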

inproceedings MHR+25a


EIML @EurIPS 2025

Workshop on Epistemic Intelligence in Machine Learning at the European Conference on Neural Information Processing Systems. Copenhagen, Denmark, Dec 03-05, 2025.

Authors

V. Margraf • J. Hanselle • J. Rodemann • M. Wever • S. Vollmer • E. Hüllermeier

Links

PDF


Research Area

 A3 | Computational Models

BibTeX Key: MHR+25a
