
Implicit Regularization for Tubal Tensor Factorizations via Gradient Descent

MCML Authors

Felix Krahmer

Prof. Dr.

Principal Investigator

Abstract

We provide a rigorous analysis of implicit regularization in an overparametrized tensor factorization problem beyond the lazy training regime. For matrix factorization problems, this phenomenon has been studied in a number of works. A particular challenge has been to design universal initialization strategies that provably lead to implicit regularization in gradient-descent methods. At the same time, it has been argued by Cohen et al. (2016) that more general classes of neural networks can be captured by considering tensor factorizations. However, in the tensor case, implicit regularization has only been rigorously established for gradient flow or in the lazy training regime. In this paper, we prove the first tensor result of its kind for gradient descent rather than gradient flow. We focus on the tubal tensor product and the associated notion of low tubal rank, encouraged by the relevance of this model for image data. We establish that gradient descent in an overparametrized tensor factorization model with a small random initialization exhibits an implicit bias towards solutions of low tubal rank. Our theoretical findings are illustrated in an extensive set of numerical simulations showcasing the dynamics predicted by our theory as well as the crucial role of using a small random initialization.
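
For readers unfamiliar with the tubal setting, the sketch below illustrates the objects the abstract refers to: the tubal (t-)product in the standard Kilmer-Martin convention, which multiplies third-order tensors slice by slice after a Fourier transform along the third mode, and gradient descent on an overparametrized factorization U * V ≈ T started from a small random initialization. This is a minimal NumPy illustration, not the authors' code; the tensor sizes, step size eta, initialization scale alpha, and iteration count are arbitrary choices for demonstration.

import numpy as np

def t_product(A, B):
    # t-product of A (m x p x n) and B (p x q x n): FFT along the third
    # mode, slice-wise matrix products, inverse FFT back.
    A_hat = np.fft.fft(A, axis=2)
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.einsum('ipk,pqk->iqk', A_hat, B_hat)
    return np.real(np.fft.ifft(C_hat, axis=2))

def t_transpose(A):
    # Tubal transpose: transpose each frontal slice, reverse slices 2..n.
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate([At[:, :, :1], At[:, :, :0:-1]], axis=2)

def tubal_rank(T, tol=1e-2):
    # Tubal rank: maximum matrix rank among the Fourier-domain slices.
    T_hat = np.fft.fft(T, axis=2)
    return max(np.linalg.matrix_rank(T_hat[:, :, k], tol=tol)
               for k in range(T.shape[2]))

rng = np.random.default_rng(0)
m, r, n = 8, 2, 4                             # illustrative sizes; true tubal rank r
T = t_product(rng.standard_normal((m, r, n)),
              rng.standard_normal((r, m, n))) # ground truth of tubal rank <= r

alpha, eta = 1e-3, 1e-3                       # small init scale and step size (assumed)
U = alpha * rng.standard_normal((m, m, n))    # overparametrized: full m x m x n factors
V = alpha * rng.standard_normal((m, m, n))

for _ in range(20000):
    R = t_product(U, V) - T                   # residual
    # Gradient of ||U * V - T||_F^2 with respect to each factor:
    U, V = (U - eta * 2 * t_product(R, t_transpose(V)),
            V - eta * 2 * t_product(t_transpose(U), R))

print('loss:', np.linalg.norm(t_product(U, V) - T) ** 2)
print('tubal rank of U * V:', tubal_rank(t_product(U, V)))

In line with the paper's message, one would expect the learned product U * V to approach T while staying close to tubal rank r even though the factors are full-sized; enlarging the initialization scale alpha typically weakens this low-rank bias.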

ICML 2025

42nd International Conference on Machine Learning. Vancouver, Canada, Jul 13-19, 2025.
A* Conference

Authors

S. Karnik • A. Veselovska • M. Iwen • F. Krahmer


Research Area

 A2 | Mathematical Foundations

BibTeX Key: KVI+25
