Benchmarking the performance of Bayesian optimization across multiple experimental materials science domains
Main Authors:
Format: Article
Language: English
Published: Nature Portfolio, 2021
Subjects:
Online Access: https://doaj.org/article/bfc6d1e9f14b4b8bbc093f32c3360b8c
Abstract: Bayesian optimization (BO) has been leveraged for guiding autonomous and high-throughput experiments in materials science. However, few have evaluated the efficiency of BO across a broad range of experimental materials domains. In this work, we quantify the performance of BO with a collection of surrogate model and acquisition function pairs across five diverse experimental materials systems. By defining acceleration and enhancement metrics for materials optimization objectives, we find that surrogate models such as Gaussian Process (GP) with anisotropic kernels and Random Forest (RF) have comparable performance in BO, and both outperform the commonly used GP with isotropic kernels. GP with anisotropic kernels has demonstrated the most robustness, yet RF is a close alternative and warrants more consideration because it is free from distribution assumptions, has smaller time complexity, and requires less effort in initial hyperparameter selection. We also raise awareness about the benefits of using GP with anisotropic kernels in future materials optimization campaigns.
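To make the comparison described in the abstract concrete, below is a minimal Bayesian-optimization sketch contrasting the two surrogate choices the study evaluates: a GP with an anisotropic (per-dimension, ARD) Matern kernel versus a Random Forest whose uncertainty is taken from the spread across trees. The objective function, search bounds, candidate pool, and Expected Improvement acquisition are illustrative assumptions for this sketch, not the authors' actual experimental systems or code.

```python
# Illustrative BO loop: anisotropic-kernel GP surrogate vs. Random Forest surrogate.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical 2-D response surface standing in for an experimental measurement.
    return -np.sum((x - 0.3) ** 2, axis=-1)

def expected_improvement(mu, sigma, y_best, xi=0.01):
    # Standard Expected Improvement acquisition for maximization.
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial experiments and a discrete pool of candidate conditions.
X = rng.uniform(0, 1, size=(5, 2))
y = objective(X)
candidates = rng.uniform(0, 1, size=(500, 2))

# Anisotropic GP: one length scale per input dimension (ARD).
gp = GaussianProcessRegressor(
    kernel=Matern(length_scale=[1.0, 1.0], nu=2.5), normalize_y=True
)
# Random Forest surrogate; predictive spread estimated from individual trees.
rf = RandomForestRegressor(n_estimators=200, random_state=0)

for _ in range(10):  # sequential optimization loop
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # To use the RF surrogate instead, replace the two lines above with:
    # rf.fit(X, y)
    # tree_preds = np.stack([t.predict(candidates) for t in rf.estimators_])
    # mu, sigma = tree_preds.mean(axis=0), tree_preds.std(axis=0)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best observed value:", y.max())
```

In this sketch the anisotropic kernel lets the GP learn a separate length scale for each input, mirroring the paper's point that per-dimension sensitivity matters in materials search spaces, while the RF variant avoids Gaussian assumptions and kernel hyperparameter tuning at the cost of a cruder uncertainty estimate.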