AI Keynote Series
Simplifying Debiased Inference via Automatic Differentiation and Probabilistic Programming
Alex Luedtke, Department of Statistics, University of Washington
13.02.2025
10:00 am - 11:30 am
Online via Zoom
The speaker will introduce an algorithm that simplifies the construction of efficient estimators, making them accessible to a broader audience. 'Dimple' takes as input computer code representing a parameter of interest and outputs an efficient estimator. Unlike standard approaches, it does not require users to derive a functional derivative known as the efficient influence function. Dimple avoids this task by applying automatic differentiation to the statistical functional of interest. Doing so requires expressing this functional as a composition of primitives satisfying a novel differentiability condition. Dimple also uses this composition to determine the nuisances it must estimate. In software, primitives can be implemented independently of one another and reused across different estimation problems. The speaker will present a proof-of-concept Python implementation and showcase through examples how it allows users to go from parameter specification to efficient estimation with just a few lines of code.
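The announcement does not include code, but the core idea, obtaining an influence function by differentiating the code that defines the parameter, can be sketched briefly. The example below is an illustrative approximation, not the speaker's Dimple package or its API: it uses JAX's automatic differentiation to recover the well-known efficient influence function of the variance from code defining it as a functional of empirical weights, then forms a one-step (debiased) estimate. The function names variance_functional, influence_function, and one_step_estimator are hypothetical.

```python
# Minimal sketch of "influence functions via autodiff" (illustrative only, not Dimple).
# The parameter is written as code acting on empirical weights; jax.grad supplies the
# Gateaux derivative, so no influence function needs to be derived by hand.
import jax
import jax.numpy as jnp

def variance_functional(weights, x):
    """Parameter of interest as code: Var(X) under the weighted empirical distribution."""
    mean = jnp.sum(weights * x)
    return jnp.sum(weights * x**2) - mean**2

def influence_function(functional, x):
    """Gateaux derivative of the functional at the empirical distribution,
    in the direction of each point mass, obtained via automatic differentiation."""
    n = x.shape[0]
    w = jnp.full(n, 1.0 / n)
    grad_w = jax.grad(functional)(w, x)       # d psi / d w_i at the empirical weights
    return grad_w - jnp.sum(w * grad_w)       # centre: derivative toward delta_{x_i}

def one_step_estimator(functional, x):
    """Plug-in estimate plus the empirical mean of the influence function."""
    n = x.shape[0]
    w = jnp.full(n, 1.0 / n)
    plug_in = functional(w, x)
    return plug_in + jnp.mean(influence_function(functional, x))

x = jnp.array([1.2, -0.4, 0.7, 2.1, 0.3])
print(one_step_estimator(variance_functional, x))   # debiased variance estimate
print(influence_function(variance_functional, x))   # matches (x - mean)^2 - variance
```

In this fully plug-in example the one-step correction is zero; the correction becomes non-trivial once the functional involves estimated nuisances such as regressions, which is where the approach described in the talk does its real work.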
Organized by:
Institute of AI in Management LMU Munich
Related

Colloquium • 25.06.2025 • LMU Department of Statistics and via Zoom
Practical Causal Reasoning as a Means for Ethical ML
25.06.25, 4:15-5:45 pm: Isabel Valera, Uni Saarbrücken, explores fairness in ML and introduces DeCaFlow, a causal model for counterfactuals.

Colloquium • 11.06.2025 • LMU Department of Statistics and via Zoom
Veridical Data Science and PCS Uncertainty Quantification
11.06.25, 4:15-5:45 pm: Bin Yu, UC Berkeley, on how PCS improves AI reliability by tackling hidden uncertainty in data science decisions.