How to Induce Regularization in Linear Models: A Guide to Reparametrizing Gradient Flow

Abstract

In this work, we analyze the relation between reparametrizations of gradient flow and the induced implicit bias in linear models, which encompass various basic regression tasks. In particular, we aim to understand the influence of the model parameters (reparametrization, loss, and link function) on the convergence behavior of gradient flow. Our results provide conditions under which the implicit bias can be well described and convergence of the flow is guaranteed. We furthermore show how to use these insights to design reparametrization functions that lead to specific implicit biases, which are closely connected to ℓp or trigonometric regularizers.
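To make the reparametrization idea concrete, the sketch below compares plain gradient flow on an underdetermined least-squares problem with a Hadamard-type reparametrized flow, x = u ⊙ u. This is a standard example from the implicit-bias literature, not the paper's own construction; it is known to behave like ℓ1 (sparsity-promoting) regularization when the initialization is small. The problem sizes, step size, and initialization scale alpha are illustrative assumptions, and run_flow is a hypothetical helper that discretizes the flow with explicit Euler steps.

```python
# Minimal numerical sketch (assumed setup, not the paper's exact setting):
# compare the implicit bias of plain gradient flow with that of the
# Hadamard-type reparametrization x = u * u (elementwise) on an
# underdetermined least-squares problem. x = u * u only represents
# nonnegative vectors; signed variants use x = u * u - v * v.
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                                # more unknowns than equations
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:3] = [2.0, 1.5, 1.0]                 # sparse, nonnegative ground truth
y = A @ x_true

def run_flow(grad, z0, lr=1e-2, steps=200_000):
    """Explicit-Euler discretization of the gradient flow dz/dt = -grad(z)."""
    z = z0.copy()
    for _ in range(steps):
        z -= lr * grad(z)
    return z

# Plain gradient flow on x, started at zero: converges to an interpolating
# solution that is biased toward small l2 norm (typically dense).
grad_x = lambda x: A.T @ (A @ x - y)
x_gf = run_flow(grad_x, np.zeros(n))

# Reparametrized flow: optimize u with x = u * u; the gradient follows from
# the chain rule. A small nonzero initialization alpha makes the l1-like
# bias pronounced.
alpha = 1e-2
grad_u = lambda u: 2.0 * u * (A.T @ (A @ (u * u) - y))
u_gf = run_flow(grad_u, alpha * np.ones(n))
x_reparam = u_gf * u_gf

for name, x in [("plain GF", x_gf), ("reparametrized GF", x_reparam)]:
    print(f"{name:>18}: residual {np.linalg.norm(A @ x - y):.2e}, "
          f"l1 norm {np.linalg.norm(x, 1):.2f}, "
          f"entries > 1e-2: {int((np.abs(x) > 1e-2).sum())}")
```

Under these assumptions one would expect both flows to interpolate the data, with the reparametrized flow returning a markedly sparser solution of smaller ℓ1 norm, mirroring the kind of induced regularization the abstract refers to.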


Preprint

Aug. 2023

Authors

H.-H. Chou • J. Maly • D. Stöger


Research Area

A2 | Mathematical Foundations

BibTeX Key: CMS23
