
The Price of Robustness: Stable Classifiers Need Overparameterization

MCML Authors

Abstract

The relationship between overparameterization, stability, and generalization remains incompletely understood in the setting of discontinuous classifiers. We address this gap by establishing a generalization bound for finite function classes that improves inversely with class stability, defined as the expected distance to the decision boundary in the input domain (margin). Interpreting class stability as a quantifiable notion of robustness, we derive as a corollary a law of robustness for classification that extends the results of Bubeck and Sellke beyond smoothness assumptions to discontinuous functions. In particular, any interpolating model with p≈n parameters on n data points must be unstable, implying that substantial overparameterization is necessary to achieve high stability. We obtain analogous results for parameterized infinite function classes by analyzing a stronger robustness measure derived from the margin in the codomain, which we refer to as the normalized co-stability. Experiments support our theory: stability increases with model size and correlates with test performance, while traditional norm-based measures remain largely uninformative.
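To make the abstract's central quantity concrete: class stability is defined as the expected distance from a sample point to the decision boundary in input space. The sketch below (not from the paper; names and the linear-model setting are illustrative assumptions) estimates this empirically for a linear classifier, where the distance to the boundary {x : w·x + b = 0} has the closed form |w·x + b| / ‖w‖.

```python
# Illustrative sketch, NOT the paper's implementation: empirical class
# stability of a linear classifier sign(w·x + b), measured as the mean
# input-space distance of sample points to the decision boundary.
import numpy as np

def empirical_stability(w, b, X):
    """Mean distance from rows of X to the hyperplane {x : w·x + b = 0}."""
    w = np.asarray(w, dtype=float)
    # Point-to-hyperplane distance in closed form for a linear model.
    return float(np.mean(np.abs(X @ w + b) / np.linalg.norm(w)))

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))          # synthetic standard-normal inputs
w, b = np.array([1.0, 0.0]), 0.0        # boundary is the x2-axis
# Analytically, E|x1| = sqrt(2/pi) ~ 0.80 for standard-normal x1.
print(empirical_stability(w, b, X))
```

For general (nonlinear, possibly discontinuous) classifiers no such closed form exists, which is part of why the paper's extension of the law of robustness beyond smoothness assumptions is nontrivial.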

BibTeX type: inproceedings


ICLR 2026

14th International Conference on Learning Representations. Rio de Janeiro, Brazil, Apr 23-27, 2026. To be published. Preprint available.
A* Conference

Authors

J. von Berg, A. Fono, M. Datres, S. Maskey, G. Kutyniok

Links

arXiv

Research Area

 A2 | Mathematical Foundations

BibTeXKey: BFD+26
