
Advancing Hyperparameter Optimization: Foundations, Multiple Objectives and Algorithmic Innovations Informed Through Benchmarking


Abstract

Hyperparameter optimization (HPO) is a fundamental aspect of machine learning (ML), directly influencing model performance and adaptability. As a computationally expensive black-box optimization problem, HPO requires efficient algorithms to identify optimal hyperparameter configurations. This thesis advances the field of HPO along three key dimensions: foundational insights, HPO in the presence of more than one objective, and algorithmic innovations through benchmarking. (Shortened.)
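To make the black-box framing concrete, the following is a minimal sketch of HPO via random search. The objective `train_and_score` is a hypothetical stand-in for fitting an ML model and returning its validation error; the hyperparameter names (`lr`, `max_depth`) and search ranges are illustrative assumptions, not taken from the thesis.

```python
import random

def train_and_score(lr, max_depth):
    # Hypothetical black-box objective: in practice this would train an ML
    # model with the given hyperparameters and return validation error.
    return (lr - 0.1) ** 2 + abs(max_depth - 5) * 0.01

def random_search(n_trials, seed=0):
    """Minimal random-search HPO: sample configurations, keep the best."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(n_trials):
        cfg = {"lr": rng.uniform(1e-4, 1.0),
               "max_depth": rng.randint(1, 10)}
        score = train_and_score(**cfg)  # one expensive black-box evaluation
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = random_search(50)
```

Random search treats each evaluation as an opaque function call, which is exactly why more sample-efficient methods (e.g. Bayesian optimization) matter when each evaluation is a full model training run.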



Dissertation

LMU München. May 2025.

Authors

L. Schneider

Links

DOI

Research Area

 A1 | Statistical Foundations & Explainability

BibTeXKey: Sch25
