
Smoothing the Edges: Smooth Optimization for Sparse Regularization Using Hadamard Overparametrization

MCML Authors


Christian Müller

Prof. Dr.

Principal Investigator


Bernd Bischl

Prof. Dr.

Director


David Rügamer

Prof. Dr.

Principal Investigator

Abstract

We present a framework for smooth optimization of explicitly regularized objectives for (structured) sparsity. These non-smooth and possibly non-convex problems typically rely on solvers tailored to specific models and regularizers. In contrast, our method enables fully differentiable and approximation-free optimization and is thus compatible with the ubiquitous gradient descent paradigm in deep learning. The proposed optimization transfer comprises an overparametrization of selected parameters and a change of penalties. In the overparametrized problem, smooth surrogate regularization induces non-smooth, sparse regularization in the base parametrization. We prove that the surrogate objective is equivalent in the sense that it not only has identical global minima but also matching local minima, thereby avoiding the introduction of spurious solutions. Additionally, our theory establishes results of independent interest regarding matching local minima for arbitrary, potentially unregularized, objectives. We comprehensively review sparsity-inducing parametrizations across different fields that are covered by our general theory, extend their scope, and propose improvements in several aspects. Numerical experiments further demonstrate the correctness and effectiveness of our approach on several sparse learning problems ranging from high-dimensional regression to sparse neural network training.
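As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below overparametrizes a regression coefficient vector as a Hadamard product beta = u * v and applies a smooth L2 penalty to the factors u and v; this surrogate is known to induce an L1 (lasso-type) penalty on beta, so plain gradient descent can be used throughout. The toy data, the penalty weight `lam`, the optimizer settings, and the final thresholding are illustrative assumptions.

```python
# Minimal sketch of Hadamard overparametrization for sparse regression
# (illustrative only; hyperparameters and data are made up).
import torch

torch.manual_seed(0)

# Toy sparse regression problem: y = X @ beta_true + noise, with k active coefficients.
n, p, k = 200, 50, 5
X = torch.randn(n, p)
beta_true = torch.zeros(p)
beta_true[:k] = torch.randn(k)
y = X @ beta_true + 0.01 * torch.randn(n)

lam = 0.1                            # surrogate regularization strength (assumed)
u = torch.randn(p, requires_grad=True)
v = torch.randn(p, requires_grad=True)
opt = torch.optim.SGD([u, v], lr=1e-2)

for _ in range(5000):
    opt.zero_grad()
    beta = u * v                     # Hadamard (elementwise) overparametrization
    fit = 0.5 * (X @ beta - y).pow(2).mean()
    # Smooth surrogate penalty: L2 on the factors instead of a non-smooth L1 on beta.
    penalty = 0.5 * lam * (u.pow(2).sum() + v.pow(2).sum())
    (fit + penalty).backward()
    opt.step()

beta_hat = (u * v).detach()
# Gradient descent drives inactive coordinates toward zero; threshold for reporting.
print("nonzeros (|beta| > 1e-3):", int((beta_hat.abs() > 1e-3).sum()))
```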

misc


Preprint

Jul. 2023

Authors

C. Kolb, C. L. Müller, B. Bischl and D. Rügamer

Research Areas

 A1 | Statistical Foundations & Explainability

 C2 | Biology

BibTeX key: KMB+23a
