High-dimensional Bayesian optimization with foundational assumptions
Title: High-dimensional Bayesian optimization with foundational assumptions
DNr: NAISS 2024/22-1657
Project Type: NAISS Small Compute
Principal Investigator: Carl Hvarfner <carl.hvarfner@cs.lth.se>
Affiliation: Lunds universitet
Duration: 2024-12-13 – 2026-01-01
Classification: 10106
Homepage: https://hvarfner.github.io/
Keywords:

Abstract

High-dimensional problems have long been considered the Achilles' heel of Bayesian optimization (BO) algorithms. The well-documented curse of dimensionality, or boundary issue, has spawned a collection of algorithms that work around the issue through assumptions of either effective low-dimensional subspaces or additivity, or by resorting to localized BO variants. In this paper, we make the distinction between dimensionality and complexity in BO problems, and show that as long as the problem is presumed to be of reasonable and constant complexity, as measured by the RKHS norm on the space of black-box functions, increasing the dimensionality does not substantially degrade performance. Our findings are complemented by state-of-the-art results on problems with both dense and sparse effective subspaces, as well as real-world high-dimensional optimization problems.
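To illustrate the setting, the sketch below runs a standard (vanilla) GP-based BO loop directly in a high-dimensional search space, without subspace or additivity assumptions. It is a minimal illustration using BoTorch, not the project's implementation; the synthetic objective, dimensionality, evaluation budget, and optimizer settings are assumptions chosen for brevity.

```python
# Hedged sketch: vanilla GP-based BO on a 100-dimensional synthetic problem.
# The objective, budget, and settings below are illustrative assumptions.
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

dim = 100  # high-dimensional search space, no low-dimensional embedding assumed
bounds = torch.stack([torch.zeros(dim, dtype=torch.double),
                      torch.ones(dim, dtype=torch.double)])

def objective(x: torch.Tensor) -> torch.Tensor:
    # Placeholder black-box: a smooth function of all inputs (maximization).
    return -(x - 0.5).pow(2).sum(dim=-1, keepdim=True)

# Initial design
train_X = torch.rand(10, dim, dtype=torch.double)
train_Y = objective(train_X)

for _ in range(20):  # small illustrative budget
    # Standard GP surrogate with the library's default ARD kernel
    model = SingleTaskGP(train_X, train_Y)
    mll = ExactMarginalLogLikelihood(model.likelihood, model)
    fit_gpytorch_mll(mll)  # fit kernel hyperparameters by marginal likelihood

    # Expected Improvement over the full d-dimensional box
    acq = ExpectedImprovement(model, best_f=train_Y.max())
    cand, _ = optimize_acqf(acq, bounds=bounds, q=1,
                            num_restarts=4, raw_samples=256)

    train_X = torch.cat([train_X, cand])
    train_Y = torch.cat([train_Y, objective(cand)])

print("best observed value:", train_Y.max().item())
```

The point of the sketch is that nothing in the loop depends on the dimensionality beyond the bounds tensor: if the black-box function is of modest complexity (small RKHS norm under the chosen kernel), the same vanilla procedure can be applied as the number of input dimensions grows.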