
Fast Rates for Nonstationary Weighted Risk Minimization


Abstract

Weighted empirical risk minimization is a common approach to prediction under distribution drift. This article studies its out-of-sample prediction error under nonstationarity. We provide a general decomposition of the excess risk into a learning term and an error term associated with distribution drift, and prove oracle inequalities for the learning error under mixing conditions. The learning bound holds uniformly over arbitrary weight classes and accounts for the effective sample size induced by the weight vector, the complexity of the weight and hypothesis classes, and potential data dependence. We illustrate the applicability and sharpness of our results in (auto-)regression problems with linear models, basis approximations, and neural networks, recovering minimax-optimal rates (up to logarithmic factors) when specialized to unweighted and stationary settings.
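The idea behind weighted empirical risk minimization under drift can be sketched with a small simulation. The snippet below is a minimal illustration, not the authors' construction: the exponential weight profile, the half-life parameter, and the drifting linear model are all illustrative assumptions. It fits a weighted least-squares estimator that downweights old observations and compares it to the unweighted fit at the current distribution, also computing the effective sample size induced by the weight vector.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 3

# Covariates and a linear coefficient vector that drifts over time t in [0, 1].
X = rng.normal(size=(n, d))
t = np.linspace(0.0, 1.0, n)
beta_t = np.stack([1.0 + 2.0 * t, -t, 0.5 * np.ones(n)], axis=1)
y = np.sum(X * beta_t, axis=1) + 0.1 * rng.normal(size=n)

def weighted_erm(X, y, w):
    """Minimize sum_i w_i * (y_i - x_i^T beta)^2 (weighted least squares)."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

# Exponentially decaying weights favor recent observations (assumed profile).
halflife = 50
w = 0.5 ** ((n - 1 - np.arange(n)) / halflife)
w /= w.sum()

# Effective sample size induced by the weight vector (normalized weights).
ess = 1.0 / np.sum(w ** 2)

beta_weighted = weighted_erm(X, y, w)
beta_unweighted = weighted_erm(X, y, np.full(n, 1.0 / n))

# Compare both estimates against the current (t = 1) coefficient vector:
# the weighted fit trades variance (smaller ESS) for much smaller drift bias.
beta_now = beta_t[-1]
err_w = np.linalg.norm(beta_weighted - beta_now)
err_u = np.linalg.norm(beta_unweighted - beta_now)
print(ess, err_w, err_u)
```

Under this kind of drift, `err_w` comes out well below `err_u`, illustrating the trade-off the abstract's decomposition captures: the weights shrink the drift-induced error term at the price of a smaller effective sample size in the learning term.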



Preprint

Feb. 2026

Authors

T. Brock, T. Nagler

Links

arXiv

Research Area

A1 | Statistical Foundations & Explainability

BibTeX Key: BN26
