
Reshuffling Resampling Splits Can Improve Generalization of Hyperparameter Optimization

MCML Authors

Thomas Nagler

Prof. Dr.

Principal Investigator

Bernd Bischl

Prof. Dr.

Director

Matthias Feurer

Prof. Dr.

Thomas Bayes Fellow

* Former Thomas Bayes Fellow

Abstract

Hyperparameter optimization is crucial for obtaining peak performance of machine learning models. The standard protocol evaluates various hyperparameter configurations using a resampling estimate of the generalization error to guide optimization and select a final hyperparameter configuration. Without much evidence, paired resampling splits, i.e., either a fixed train-validation split or a fixed cross-validation scheme, are often recommended. We show that, surprisingly, reshuffling the splits for every configuration often improves the final model's generalization performance on unseen data. Our theoretical analysis explains how reshuffling affects the asymptotic behavior of the validation loss surface and provides a bound on the expected regret in the limiting regime. This bound connects the potential benefits of reshuffling to the signal and noise characteristics of the underlying optimization problem. We confirm our theoretical results in a controlled simulation study and demonstrate the practical usefulness of reshuffling in a large-scale, realistic hyperparameter optimization experiment. While reshuffling leads to test performances that are competitive with using fixed splits, it drastically improves results for a single train-validation holdout protocol and can often make holdout competitive with standard CV while being computationally cheaper.
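The contrast between the two protocols in the abstract can be sketched in a few lines. The code below is illustrative only (not from the paper; function and parameter names are our own): with `reshuffle=False` every configuration is scored on the same fixed holdout split, while `reshuffle=True` draws a fresh random split per configuration.

```python
import numpy as np


def make_splits(n_configs, n, frac=0.8, reshuffle=True, seed=0):
    """Generate one holdout (train_idx, val_idx) pair per configuration.

    reshuffle=True draws a fresh random split for every configuration
    (the variant studied in the paper); reshuffle=False reuses a single
    fixed split for all configurations (the standard protocol).
    """
    rng = np.random.default_rng(seed)
    cut = int(frac * n)
    fixed = rng.permutation(n)  # the one split used when not reshuffling
    splits = []
    for _ in range(n_configs):
        perm = rng.permutation(n) if reshuffle else fixed
        splits.append((perm[:cut], perm[cut:]))
    return splits
```

In an HPO loop, each configuration would then be fitted on its `train_idx` and scored on its `val_idx`; under reshuffling, the validation noise is decorrelated across configurations rather than shared.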

inproceedings


NeurIPS 2024

38th Conference on Neural Information Processing Systems. Vancouver, Canada, Dec 10-15, 2024.
A* Conference

Authors

T. Nagler, L. Schneider, B. Bischl, M. Feurer

Links

URL GitHub

Research Area

A1 | Statistical Foundations & Explainability

BibTeXKey: NSB+24
