
Approximate Bayesian Inference With Stein Functional Variational Gradient Descent


Abstract

We propose a general-purpose variational algorithm that forms a natural analogue of Stein variational gradient descent (SVGD) in function space. While SVGD successively updates a set of particles to match a target density, the method introduced here, Stein functional variational gradient descent (SFVGD), updates a set of particle functions to match a target stochastic process (SP). The update step is derived by minimizing the functional derivative of the Kullback-Leibler divergence between SPs. SFVGD can be used either to train Bayesian neural networks (BNNs) or for ensemble gradient boosting. We show the efficacy of training BNNs with SFVGD on various real-world datasets.
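As background for the particle update that SFVGD lifts to function space, the following is a minimal NumPy sketch of one standard (finite-dimensional) SVGD step with an RBF kernel. This illustrates the density-space method the abstract contrasts against, not the authors' functional-space algorithm; the function names and hyperparameters (`h`, `eps`) are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradients
    gradK[j, i, :] = grad_{x_j} k(x_j, x_i), for particles X of shape (n, d)."""
    diff = X[:, None, :] - X[None, :, :]              # (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h)) # (n, n)
    gradK = -diff / h * K[:, :, None]                 # (n, n, d)
    return K, gradK

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One SVGD update: x_i <- x_i + eps * phi(x_i), where
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].
    The first term pulls particles toward high density; the second repels them."""
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    phi = (K.T @ grad_logp(X) + gradK.sum(axis=0)) / n
    return X + eps * phi
```

For example, iterating `svgd_step` with `grad_logp = lambda x: -x` (a standard normal target) drives an arbitrary initial particle cloud toward samples from that target. SFVGD replaces the particles with particle functions and the target density with a target SP.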

Publication type: inproceedings


ICLR 2023

11th International Conference on Learning Representations. Kigali, Rwanda, May 01-05, 2023.
A* Conference

Authors

T. Pielok, B. Bischl, D. Rügamer

Links

URL

Research Area

A1 | Statistical Foundations & Explainability

BibTeX Key: PBR23
