
The Price of Robustness: Stable Classifiers Need Overparameterization

MCML Authors

Abstract

In this work, we show that class stability, defined as the expected distance of an input to the decision boundary, captures what classical capacity measures such as weight norms fail to explain. We prove a generalization bound that improves inversely with the class stability, which we interpret as a quantifiable notion of robustness. As a corollary, we derive a law of robustness for classification: any interpolating model with too few parameters must be unstable, so high stability requires significant overparameterization. Crucially, our results extend beyond smoothness assumptions and apply to discontinuous classifiers. Preliminary experiments support our theory: empirical stability increases with model size, while norm-based measures remain uninformative.
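To illustrate the notion of class stability from the abstract, the sketch below estimates it for the simple linear case, where the distance of an input to the decision boundary has a closed form. This is our own minimal illustration, not the paper's method; the function name and setup are hypothetical.

```python
import numpy as np

def class_stability_linear(w, b, X):
    """Estimate class stability of the linear classifier sign(w.x + b)
    as the average distance of the inputs in X to its decision boundary.

    For a hyperplane {x : w.x + b = 0}, the distance of a point x to the
    boundary is |w.x + b| / ||w||; class stability is the expectation of
    this distance over the input distribution, approximated here by the
    empirical mean over the rows of X.
    """
    distances = np.abs(X @ w + b) / np.linalg.norm(w)
    return distances.mean()

# Hypothetical usage: points at distance 2 and 4 from the boundary x1 = 0.
w = np.array([1.0, 0.0])
X = np.array([[2.0, 0.0], [-4.0, 0.0]])
print(class_stability_linear(w, 0.0, X))  # mean distance: 3.0
```

A larger estimated value means inputs sit farther from the boundary on average, i.e. the classifier is more robust to input perturbations; note the estimator itself needs no smoothness of the classifier beyond being able to compute distances to the boundary.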



HiLD @ICML 2025

Workshop on High-dimensional Learning Dynamics at the 42nd International Conference on Machine Learning. Vancouver, Canada, Jul 13-19, 2025.

Authors

J. von Berg • A. Fono • M. Datres • S. Maskey • G. Kutyniok


Research Area

A2 | Mathematical Foundations

BibTeX Key: BFD+25
