Benchmarking the performance of Bayesian optimization across multiple experimental materials science domains

Qiaohao Liang*, Aldair E. Gongora, Zekun Ren, Armi Tiihonen, Zhe Liu, Shijing Sun, James R. Deneault, Daniil Bash, Flore Mekki-Berrada, Saif A. Khan, Kedar Hippalgaonkar, Benji Maruyama, Keith A. Brown, John Fisher, Tonio Buonassisi

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

92 Citations (Scopus)

Abstract

Bayesian optimization (BO) has been leveraged for guiding autonomous and high-throughput experiments in materials science. However, few studies have evaluated the efficiency of BO across a broad range of experimental materials domains. In this work, we quantify the performance of BO with a collection of surrogate model and acquisition function pairs across five diverse experimental materials systems. By defining acceleration and enhancement metrics for materials optimization objectives, we find that surrogate models such as the Gaussian Process (GP) with anisotropic kernels and the Random Forest (RF) have comparable performance in BO, and that both outperform the commonly used GP with isotropic kernels. The GP with anisotropic kernels demonstrates the most robust performance, yet the RF is a close alternative and warrants more consideration because it is free from distribution assumptions, has lower time complexity, and requires less effort in initial hyperparameter selection. We also raise awareness about the benefits of using GP with anisotropic kernels in future materials optimization campaigns.
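To make the comparison in the abstract concrete, the sketch below illustrates a sequential BO loop with the two surrogate models discussed: a GP with an anisotropic (per-dimension length-scale) RBF kernel and an RF whose predictive uncertainty is estimated from the spread across trees. This is a minimal illustration under stated assumptions, not the authors' benchmark code: the 2-D toy objective, the expected-improvement acquisition, the candidate pool, and all hyperparameters are assumptions chosen for brevity.

```python
# Minimal sketch: BO with a GP (anisotropic RBF kernel) vs. an RF surrogate.
# Hypothetical toy problem; not the benchmark code from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical 2-D materials response surface to be maximized.
    return -((x[:, 0] - 0.3) ** 2 + 4.0 * (x[:, 1] - 0.7) ** 2)

def expected_improvement(mu, sigma, best):
    # Standard expected-improvement acquisition for maximization.
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

X = rng.uniform(size=(5, 2))              # initial random experiments
y = objective(X)
candidates = rng.uniform(size=(2000, 2))  # dense random candidate pool

# Anisotropic GP: one length scale per input dimension (ARD), refit by
# maximum-likelihood each iteration.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0]),
                              normalize_y=True)
# RF surrogate: mean/std taken across the ensemble's trees (one common
# heuristic; the paper may use a different uncertainty estimator).
rf = RandomForestRegressor(n_estimators=200, random_state=0)

for _ in range(20):                       # sequential BO loop
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # To run the RF-based loop instead, swap in:
    # rf.fit(X, y)
    # per_tree = np.stack([t.predict(candidates) for t in rf.estimators_])
    # mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[None, :]))

print("best observed value:", y.max())
```

The anisotropic kernel gives the GP a separate learned length scale for each processing variable, letting it downweight insensitive dimensions, which is consistent with the robustness the abstract attributes to it; the RF needs no distributional assumptions or kernel choice, matching the abstract's case for it as a practical alternative.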

Original language: English
Article number: 188
Journal: npj Computational Materials
Volume: 7
Issue number: 1
DOIs
Publication status: Published - 18 Nov 2021
MoE publication type: A1 Journal article-refereed
