Colloquium
Variational Learning for Large Deep Networks
Thomas Möllenhoff, RIKEN, Tokyo
10.07.2024
3:15 pm - 4:45 pm
LMU Department of Statistics and via zoom
Thomas Möllenhoff presents extensive evidence against the common belief that variational Bayesian learning is ineffective for large neural networks.
First, he shows that a recent deep learning method called sharpness-aware minimization (SAM) solves an optimal convex relaxation of the variational Bayesian objective.
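For orientation, the two objectives being connected can be sketched as follows (notation illustrative, with loss ℓ, weights w, perturbation radius ρ, and posterior q with prior p; see the talk for the precise statement relating SAM's inner maximum to an optimal convex relaxation of the expectation term):

```latex
% Sharpness-aware minimization (SAM): worst-case loss over a small
% neighborhood of the weights w (perturbation radius rho).
\min_{w} \; \max_{\|\epsilon\|_2 \le \rho} \; \ell(w + \epsilon)

% Variational Bayesian objective: expected loss under a posterior q
% over weights, regularized by the KL divergence to a prior p.
\min_{q} \; \mathbb{E}_{w \sim q}\!\left[\ell(w)\right] + \mathrm{KL}(q \,\|\, p)
```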
Then, he demonstrates that directly optimizing the variational objective with an Improved Variational Online Newton method (IVON) consistently matches or outperforms Adam for training large networks such as GPT-2 and ResNets from scratch. IVON's computational costs are nearly identical to Adam's, but its predictive uncertainty is better.
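As a concrete illustration of what directly optimizing the variational objective means, here is a minimal Bayes-by-backprop-style sketch on a toy regression problem. This is not the IVON algorithm itself (IVON uses a Newton-like update with Adam-comparable cost); the toy data, prior, and learning rate below are all illustrative:

```python
import numpy as np

# Toy linear-regression data (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

# Diagonal Gaussian posterior q(w) = N(mu, diag(sigma^2)); prior N(0, I).
mu = np.zeros(3)
rho = np.zeros(3)          # log of the posterior standard deviation
lr = 0.001                 # learning rate (illustrative)

for step in range(2000):
    sigma = np.exp(rho)
    eps = rng.normal(size=3)
    w = mu + sigma * eps                 # reparameterized sample from q
    g = X.T @ (X @ w - y)                # gradient of the squared loss at w
    # Gradients of the variational objective: expected loss + KL(q || prior).
    grad_mu = g + mu
    grad_rho = (g * eps + sigma - 1.0 / sigma) * sigma
    mu -= lr * grad_mu
    rho -= lr * grad_rho

print("posterior mean:", np.round(mu, 2))           # close to w_true
print("posterior std :", np.round(np.exp(rho), 3))  # shrinks as data accumulates
```

The learned sigma is what yields the predictive uncertainty mentioned above; in IVON, an analogous variance comes from its online second-moment estimate, which is why its costs stay close to Adam's.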
Finally, he shows several new use cases of variational learning, where it improves fine-tuning and model merging in Large Language Models, accurately predicts generalization error, and faithfully estimates sensitivity to data.
Organized by:
Department of Statistics LMU Munich
Related
Colloquium • 05.02.2025 • LMU Department of Statistics and via zoom
TBA
Colloquium at the LMU Department of Statistics with Isabel Valera (Saarland University in Saarbrücken).
Colloquium • 29.01.2025 • LMU Department of Statistics and via zoom
TBA
Colloquium at the LMU Department of Statistics with Sophie Langer (University of Twente).
Colloquium • 15.01.2025 • LMU Department of Statistics and via zoom
TBA
Colloquium at the LMU Department of Statistics with Sonja Greven (HU Berlin).
Colloquium • 11.12.2024 • LMU Department of Statistics and via zoom
TBA
Colloquium at the LMU Department of Statistics with Stijn Vansteelandt (Ghent University).
Munich AI Lectures • 25.11.2024 • Große Aula der LMU, Geschwister-Scholl-Platz 1, Room 120, 80539 München
The Mathematical Universe behind Deep Neural Networks
Join us on Nov 25 for Prof. Helmut Bölcskei’s lecture on the mathematical foundations driving deep neural networks, hosted by Bavarian AI at LMU.