
Learning With Exact Invariances in Polynomial Time

MCML Authors

Stefanie Jegelka

Prof. Dr.

Principal Investigator

Abstract

We study the statistical-computational trade-offs for learning with exact invariances (or symmetries) using kernel regression. Traditional methods, such as data augmentation, group averaging, canonicalization, and frame-averaging, either fail to provide a polynomial-time solution or are not applicable in the kernel setting. However, with oracle access to the geometric properties of the input space, we propose a polynomial-time algorithm that learns a classifier with *exact* invariances. Moreover, our approach achieves the same excess population risk (or generalization error) as the original kernel regression problem. To the best of our knowledge, this is the first polynomial-time algorithm to achieve exact (not approximate) invariances in this context. Our proof leverages tools from differential geometry, spectral theory, and optimization. A key result in our development is a new reformulation of the problem of learning under invariances as optimizing an infinite number of linearly constrained convex quadratic programs, which may be of independent interest.
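For intuition, the group-averaging baseline mentioned in the abstract can be sketched for a *finite* symmetry group, where it does run in polynomial time. The sketch below (not the paper's algorithm; the kernel, the group of cyclic coordinate shifts, and all hyperparameters are illustrative assumptions) fits kernel ridge regression with a group-averaged kernel, which makes the learned predictor exactly invariant:

```python
import numpy as np

# Illustrative sketch of the group-averaging baseline, for a finite group
# (cyclic shifts of the coordinates). Not the paper's algorithm; kernel
# and hyperparameters are arbitrary choices for the example.

def rbf(x, y, gamma=1.0):
    # Isotropic Gaussian kernel; permutations of coordinates are
    # isometries, so k(g.x, g.y) = k(x, y).
    return np.exp(-gamma * np.sum((x - y) ** 2))

def orbit(x):
    # Orbit of x under the cyclic group Z_d acting by coordinate shifts.
    return [np.roll(x, k) for k in range(len(x))]

def K_inv(x, y, gamma=1.0):
    # Group-averaged kernel K_G(x, y) = (1/|G|) * sum_g k(g.x, y).
    # For an isometric action and an isotropic k, this is invariant in
    # either argument, so the induced predictor is exactly invariant.
    return float(np.mean([rbf(gx, y, gamma) for gx in orbit(x)]))

def fit_krr(X, y, lam=1e-2, gamma=1.0):
    # Kernel ridge regression with the averaged (still PSD) kernel.
    n = len(X)
    K = np.array([[K_inv(X[i], X[j], gamma) for j in range(n)]
                  for i in range(n)])
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return lambda x: float(sum(a * K_inv(xi, x, gamma)
                               for a, xi in zip(alpha, X)))

# The fitted predictor is exactly invariant to cyclic shifts of its input.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(8, 4))
y_train = rng.normal(size=8)
f = fit_krr(X_train, y_train)
x_new = rng.normal(size=4)
invariance_gap = abs(f(x_new) - f(np.roll(x_new, 1)))
```

Each averaged-kernel evaluation loops over all |G| group elements, which is feasible here but not for the large or continuous symmetry groups the paper targets; that gap is exactly what motivates its polynomial-time construction.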



ICML 2025

42nd International Conference on Machine Learning. Vancouver, Canada, Jul 13-19, 2025.
A* Conference

Authors

A. Soleymani • B. Tahmasebi • S. Jegelka • P. Jaillet

Research Area

A3 | Computational Models

BibTeXKey: STJ+25
