
Revisiting Unbiased Implicit Variational Inference


Abstract

Recent years have witnessed growing interest in semi-implicit variational inference (SIVI) methods due to their ability to rapidly generate samples from highly complicated distributions. However, since the likelihood of these samples is non-trivial to estimate in high dimensions, current research focuses on finding effective SIVI training routines. While unbiased implicit variational inference (UIVI) has largely been dismissed as imprecise and computationally prohibitive because of its inner MCMC loop, we revisit this method and identify key shortcomings. In particular, we show that UIVI's MCMC loop can be effectively replaced via importance sampling, and that the optimal proposal distribution can be learned stably and without bias by minimizing an expected forward Kullback–Leibler divergence. Our refined approach matches or outperforms state-of-the-art methods on established SIVI benchmarks.
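The core trick mentioned in the abstract — learning a proposal r(ε|z) by minimizing an expected forward KL, which amounts to maximum likelihood on fresh samples from the joint q(ε, z) and therefore yields unbiased gradients — can be illustrated on a toy conjugate model. Everything below (the Gaussian model, the linear-Gaussian proposal, all hyperparameters) is an illustrative assumption for exposition, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy semi-implicit construction (illustrative assumption):
#   mixing variable  eps ~ N(0, 1)
#   conditional      z | eps ~ N(eps, sigma^2)
# For this conjugate pair the reverse conditional q(eps|z) is Gaussian in
# closed form, so we can check what the learned proposal converges to.
sigma = 0.5

def sample_joint(n):
    """Draw (eps, z) pairs from the joint q(eps, z)."""
    eps = rng.normal(0.0, 1.0, n)
    z = rng.normal(eps, sigma)
    return eps, z

# Learn a Gaussian proposal r(eps|z) = N(a*z + b, s^2) by maximizing
# E_{q(eps,z)}[log r(eps|z)]: minimizing the expected forward KL
# E_z[KL(q(eps|z) || r(eps|z))] is equivalent up to a constant, and fresh
# joint samples give unbiased stochastic gradients of this objective.
a, b, log_s = 0.0, 0.0, 0.0
lr = 0.05
for _ in range(2000):
    eps, z = sample_joint(256)
    s2 = np.exp(2 * log_s)
    resid = (eps - (a * z + b)) / s2
    # Gradient ascent on the mean Gaussian log-likelihood w.r.t. a, b, log_s
    a += lr * np.mean(resid * z)
    b += lr * np.mean(resid)
    log_s += lr * np.mean((eps - (a * z + b)) ** 2 / s2 - 1.0)

# Exact reverse conditional for this toy model:
#   mean z / (1 + sigma^2) = 0.8 * z,  variance sigma^2 / (1 + sigma^2) = 0.2
print("learned slope:", a, " exact:", 1 / (1 + sigma**2))
print("learned var:  ", np.exp(2 * log_s), " exact:", sigma**2 / (1 + sigma**2))
```

Because the (ε, z) pairs are drawn from the joint rather than from the intractable reverse conditional, no MCMC loop is needed to train the proposal; once learned, r(ε|z) can serve as the importance-sampling proposal in place of UIVI's inner sampler.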



ICML 2025

42nd International Conference on Machine Learning. Vancouver, Canada, Jul 13-19, 2025.
A* Conference

Authors

T. Pielok, B. Bischl, D. Rügamer

Links

URL

Research Area

 A1 | Statistical Foundations & Explainability

BibTeX Key: PBR25
