Abstract

Contextual Bayesian Optimization (CBO) efficiently optimizes black-box functions with respect to design variables, while simultaneously integrating contextual information regarding the environment, such as experimental conditions. However, the relevance of contextual variables is not necessarily known beforehand. Moreover, contextual variables can sometimes be optimized themselves at an additional cost, a setting overlooked by current CBO algorithms. A purely cost-sensitive CBO would decide whether to include optimizable contextual variables among the design variables based on their cost alone. Instead, we adaptively select a subset of contextual variables to include in the optimization, based on the trade-off between their relevance and the additional cost incurred by optimizing them compared to leaving them to be determined by the environment. We learn the relevance of contextual variables by sensitivity analysis of the posterior surrogate model while minimizing the cost of optimization by leveraging recent developments on early stopping for BO. We empirically evaluate our proposed Sensitivity-Analysis-Driven Contextual BO (SADCBO) method against alternatives on both synthetic and real-world experiments, together with extensive ablation studies, and demonstrate a consistent improvement across examples.
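The core idea of relevance-driven selection can be illustrated with a minimal, hypothetical sketch (not the paper's SADCBO implementation): fit a Gaussian process surrogate with automatic relevance determination (ARD) and use the inverse lengthscales as a simple sensitivity proxy for which contextual variables are worth optimizing. The threshold and the variable layout below are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical sketch: rank contextual variables by a simple sensitivity
# proxy, the inverse ARD lengthscales of the posterior GP surrogate, and
# keep only those whose relevance clears an (assumed) threshold.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(60, 3))   # col 0: design var; cols 1-2: contextual vars
y = np.sin(3.0 * X[:, 0]) + 2.0 * X[:, 1]  # contextual var 1 matters, var 2 does not

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[1.0, 1.0, 1.0]),  # ARD: one lengthscale per input dim
    alpha=1e-6,
    normalize_y=True,
).fit(X, y)

relevance = 1.0 / gp.kernel_.length_scale   # shorter lengthscale => more sensitive
context_dims = [1, 2]
selected = [d for d in context_dims
            if relevance[d] > 0.1 * relevance.max()]  # illustrative threshold
```

In this toy example, the irrelevant contextual dimension is fitted with a long lengthscale and is left to the environment, while the relevant one is flagged for inclusion in the optimization. SADCBO additionally weighs this relevance against the cost of optimizing each variable, which the sketch omits.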

Original language: English
Title of host publication: Proceedings of the Fortieth Conference on Uncertainty in Artificial Intelligence
Publisher: JMLR
Pages: 2450-2470
Volume: 244
Publication status: Published - 2024
MoE publication type: A4 Conference publication
Event: Conference on Uncertainty in Artificial Intelligence - Barcelona, Spain
Duration: 15 Jul 2024 – 19 Jul 2024
Conference number: 40

Publication series

Name: Proceedings of Machine Learning Research
Volume: 244
ISSN (Electronic): 2640-3498

Conference

Conference: Conference on Uncertainty in Artificial Intelligence
Abbreviated title: UAI
Country/Territory: Spain
City: Barcelona
Period: 15/07/2024 – 19/07/2024

Fingerprint

Research topics of 'Learning relevant contextual variables within Bayesian optimization'.