
Abstract

Approximate Bayesian computation (ABC) is a popular likelihood-free inference method for models with intractable likelihood functions. As ABC methods usually rely on comparing summary statistics of observed and simulated data, the choice of the statistics is crucial. This choice involves a trade-off between loss of information and dimensionality reduction, and is often determined based on domain knowledge. However, handcrafting and selecting suitable statistics is a laborious task involving multiple trial-and-error steps. In this work, we introduce an active learning method for ABC statistics selection which reduces the domain expert’s work considerably. By involving the experts, we are able to handle misspecified models, unlike the existing dimension reduction methods. Moreover, empirical results show better posterior estimates than with existing methods, when the simulation budget is limited.
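To make the core ABC mechanism concrete, here is a minimal rejection-ABC sketch on a toy model (this is an illustration of plain ABC with a hand-picked summary statistic, not the active-learning method of the paper; the simulator, the sample-mean summary, and the tolerance `eps` are all assumptions chosen for the example). Parameters are drawn from the prior, data are simulated, and a draw is kept only if its simulated summary lands within `eps` of the observed summary:

```python
import random
import statistics

def simulate(theta, n=100, rng=None):
    # Toy simulator with an intractable-likelihood stand-in:
    # n draws from Normal(theta, 1).
    rng = rng or random.Random()
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    # Hand-picked summary statistic: the sample mean.
    # Low-dimensional, but potentially lossy -- exactly the
    # trade-off the abstract describes.
    return statistics.mean(data)

def abc_rejection(observed, prior_sample, n_sims=2000, eps=0.1, seed=0):
    # Basic ABC rejection sampling: accept a prior draw theta
    # whenever its simulated summary is within eps of the
    # observed summary.
    rng = random.Random(seed)
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_sample(rng)
        s_sim = summary(simulate(theta, rng=rng))
        if abs(s_sim - s_obs) < eps:
            accepted.append(theta)
    return accepted

# "Observed" data generated from an unknown true theta = 2.0.
rng = random.Random(42)
observed = simulate(2.0, rng=rng)

# Uniform prior on [-5, 5]; the accepted draws approximate the posterior.
posterior = abc_rejection(observed, lambda r: r.uniform(-5, 5))
print(len(posterior), statistics.mean(posterior))
```

The accepted `theta` values concentrate around the true parameter; how tightly depends on how informative the summary statistic is, which is why statistic selection (here done by hand, in the paper guided by an expert in the loop) matters so much under a limited simulation budget.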
Original language: English
Title of host publication: Proceedings of the 39th International Conference on Machine Learning
Publisher: JMLR
Pages: 1893-1905
Publication status: Published - 2022
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Baltimore, United States
Duration: 17 Jul 2022 - 23 Jul 2022
Conference number: 39

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 162
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
Country/Territory: United States
City: Baltimore
Period: 17/07/2022 - 23/07/2022

Fingerprint

Research topics of 'Approximate Bayesian Computation with Domain Expert in the Loop'.
