
Efficiently Warmstarting MCMC for BNNs

MCML Authors


Bernd Bischl

Prof. Dr.

Director


David Rügamer

Prof. Dr.

Principal Investigator

Matthias Feurer

Prof. Dr.

Thomas Bayes Fellow

* Former Thomas Bayes Fellow

Abstract

Markov Chain Monte Carlo (MCMC) algorithms are widely regarded as the gold standard for approximate inference in Bayesian neural networks (BNNs). However, they remain computationally expensive and prone to inefficiencies, such as dying samplers, frequently leading to substantial waste of computational resources. While prior work has presented warmstarting techniques as an effective method to mitigate these inefficiencies, we provide a more comprehensive empirical analysis of how initializations of samplers affect their behavior. Based on various experiments examining the dynamics of warmstarting MCMC, we propose novel warmstarting strategies that leverage performance predictors and adaptive termination criteria to achieve better-performing, yet more cost-efficient, models. In numerical experiments, we demonstrate that this approach provides a practical pathway to more resource-efficient approximate inference in BNNs.
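To make the idea of warmstarting concrete, the sketch below is an illustration only, not the authors' implementation or experimental setup: it contrasts a randomly initialized ("cold") chain with one started from an approximate MAP estimate, using an unadjusted Langevin sampler on a toy Bayesian logistic-regression posterior. All names, data, and hyperparameters are invented for the example.

```python
# Minimal sketch (assumptions throughout): warmstarting an MCMC chain by
# initializing it at an approximate MAP estimate instead of a random draw.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

prior_var = 1.0  # isotropic Gaussian prior N(0, prior_var * I)

def grad_log_post(w):
    """Gradient of log p(w | X, y) for logistic regression with a Gaussian prior."""
    p = 1 / (1 + np.exp(-X @ w))
    return X.T @ (y - p) - w / prior_var

def map_estimate(steps=500, lr=0.1):
    """Cheap warmstart: gradient ascent on the (averaged) log posterior."""
    w = np.zeros(d)
    for _ in range(steps):
        w += lr * grad_log_post(w) / n
    return w

def ula_chain(w_init, n_samples=2000, step=1e-3):
    """Unadjusted Langevin algorithm started from w_init."""
    w = w_init.copy()
    samples = []
    for _ in range(n_samples):
        noise = rng.normal(size=d)
        w = w + step * grad_log_post(w) + np.sqrt(2 * step) * noise
        samples.append(w.copy())
    return np.array(samples)

cold = ula_chain(rng.normal(size=d))  # random ("cold") initialization
warm = ula_chain(map_estimate())      # warmstarted at the MAP estimate

print("cold-start chain, 2nd-half mean :", cold[len(cold) // 2:].mean(axis=0))
print("warmstarted chain, 2nd-half mean:", warm[len(warm) // 2:].mean(axis=0))
```

In this toy setting the warmstarted chain begins near a high-posterior region and thus needs less burn-in; the paper studies this effect systematically for BNNs and combines it with performance predictors and adaptive termination criteria.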

inproceedings


FPI @ICLR 2025

Workshop on Frontiers in Probabilistic Inference: Learning meets Sampling at the 13th International Conference on Learning Representations. Singapore, Apr 24-28, 2025.

Authors

D. Rundel, E. Sommer, B. Bischl, D. Rügamer, M. Feurer


Research Area

A1 | Statistical Foundations & Explainability

BibTeX Key: RSB+25
