
Efficient Learning of Stationary Diffusions With Stein-Type Discrepancies

MCML Authors


Mathias Drton

Prof. Dr.

Principal Investigator

Abstract

Learning a stationary diffusion amounts to estimating the parameters of a stochastic differential equation whose stationary distribution matches a target distribution. We build on the recently introduced kernel deviation from stationarity (KDS), which enforces stationarity by evaluating expectations of the diffusion's generator in a reproducing kernel Hilbert space. Leveraging the connection between KDS and Stein discrepancies, we introduce the Stein-type KDS (SKDS) as an alternative formulation. We prove that a vanishing SKDS guarantees that the learned diffusion's stationary distribution aligns with the target. Furthermore, under broad parametrizations, the SKDS is convex, with an empirical version that is ϵ-quasiconvex with high probability. Empirically, learning with the SKDS attains accuracy comparable to KDS while substantially reducing computational cost, and it improves on the majority of competitive baselines.
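The SKDS itself is defined in the paper; as a rough illustration of the kernelized Stein machinery it builds on, the sketch below estimates the classical kernel Stein discrepancy (Langevin Stein operator with an RBF kernel) from samples, given only the target's score function. The function name `ksd_sq` and the setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ksd_sq(x, score, h=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy between
    the empirical distribution of x (shape (n, d)) and a target known only
    through its score function (gradient of its log density), using an RBF
    kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    n, d = x.shape
    s = score(x)                                  # (n, d) scores at the samples
    diff = x[:, None, :] - x[None, :, :]          # (n, n, d) pairwise x_i - x_j
    sq = (diff ** 2).sum(-1)                      # (n, n) squared distances
    k = np.exp(-sq / (2 * h ** 2))                # RBF kernel matrix
    # Stein kernel u_p(x_i, x_j), assembled term by term:
    t1 = (s @ s.T) * k                                    # s(x_i)^T s(x_j) k
    t2 = np.einsum('id,ijd->ij', s, diff) / h ** 2 * k    # s(x_i)^T grad_{x_j} k
    t3 = -np.einsum('jd,ijd->ij', s, diff) / h ** 2 * k   # s(x_j)^T grad_{x_i} k
    t4 = (d / h ** 2 - sq / h ** 4) * k                   # tr(grad_{x_i} grad_{x_j} k)
    return (t1 + t2 + t3 + t4).mean()             # average over all sample pairs

# Example with a standard-normal target, whose score is score(x) = -x:
rng = np.random.default_rng(0)
good = rng.standard_normal((300, 1))        # samples from the target
bad = rng.standard_normal((300, 1)) + 2.0   # samples shifted away from the target
print(ksd_sq(good, lambda x: -x) < ksd_sq(bad, lambda x: -x))  # True
```

Because the Stein kernel is positive semidefinite, the V-statistic is nonnegative and shrinks toward zero as the samples match the target, which is the same "vanishing discrepancy implies matching distribution" principle the abstract invokes for the SKDS.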



AISTATS 2026

29th International Conference on Artificial Intelligence and Statistics. Tangier, Morocco, May 02-05, 2026. To be published. Preprint available.
A Conference

Authors

F. Bleile • S. Lumpp • M. Drton

Links

arXiv

Research Area

 A1 | Statistical Foundations & Explainability

BibTeXKey: BLD26
