
On the Interplay of Priors and Overparametrization in Bayesian Neural Network Posteriors

Abstract

Bayesian neural network (BNN) posteriors are often considered impractical for inference: symmetries fragment them, non-identifiabilities inflate their dimensionality, and weight-space priors are seen as meaningless. In this work, we study how overparametrization and priors jointly reshape BNN posteriors and derive implications that clarify their interplay. We show that redundancy introduces three key phenomena that fundamentally reshape the posterior geometry: layer balancedness, weight distribution on equal-probability manifolds, and prior conformity. We validate our findings through extensive experiments with posterior sampling budgets far exceeding those of earlier works, and demonstrate how overparametrization induces structured, prior-aligned weight posteriors.
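The symmetries mentioned in the abstract can be made concrete with a minimal sketch (illustrative only, not code from the paper): ReLU networks have a well-known rescaling symmetry, where multiplying one layer's weights by any c > 0 and dividing the next layer's weights by c leaves the network function unchanged, so whole manifolds of weight settings carry equal posterior probability.

```python
import numpy as np

# Illustrative sketch (not from the paper): the ReLU rescaling symmetry.
# relu(c * z) = c * relu(z) for c > 0, so scaling layer 1 by c and
# layer 2 by 1/c leaves the network's output unchanged.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first-layer weights (hypothetical shapes)
W2 = rng.normal(size=(2, 4))  # second-layer weights
x = rng.normal(size=3)        # an arbitrary input

def forward(W1, W2, x):
    """Two-layer ReLU network without biases."""
    return W2 @ np.maximum(W1 @ x, 0.0)

c = 3.7  # any positive rescaling factor
assert np.allclose(forward(W1, W2, x), forward(c * W1, W2 / c, x))
```

Every choice of c traces out a curve of functionally identical weights, a simple instance of the equal-probability manifolds along which the posterior distributes its mass.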

AISTATS 2026

29th International Conference on Artificial Intelligence and Statistics. Tangier, Morocco, May 02-05, 2026. Spotlight Presentation. To be published.

Authors

J. Kobialka • E. Sommer • J. Kwon • D. Dold • D. Rügamer


Research Area

 A1 | Statistical Foundations & Explainability

BibTeXKey: KSK+26
