Research Group Felix Krahmer

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Felix Krahmer is Assistant Professor of Optimization & Data Analysis at TU Munich.

His research focuses on the mathematical foundations of signal and image processing. In particular, his research agenda covers randomized sensing methods, especially in compressed sensing, dimension reduction, and analog-to-digital conversion. His research involves not only the theoretical analysis of such methods but also their application, for example in a project on the non-destructive testing of steel pipes.

Team members @MCML

Hung-Hsu Chou

Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Apostolos Evangelidis

Optimization & Data Analysis

A2 | Mathematical Foundations

Hannah Laus

Optimization & Data Analysis

A2 | Mathematical Foundations

Hanna Veselovska

Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Publications @MCML

[12]
F. Hoppe, C. M. Verdun, H. Laus, F. Krahmer and H. Rauhut.
Non-Asymptotic Uncertainty Quantification in High-Dimensional Learning.
38th Conference on Neural Information Processing Systems (NeurIPS 2024). Vancouver, Canada, Dec 10-15, 2024. To be published. Preprint at arXiv. arXiv.
Abstract

Uncertainty quantification (UQ) is a crucial but challenging task in many high-dimensional regression or learning problems to increase the confidence of a given predictor. We develop a new data-driven approach for UQ in regression that applies both to classical regression approaches such as the LASSO as well as to neural networks. One of the most notable UQ techniques is the debiased LASSO, which modifies the LASSO to allow for the construction of asymptotic confidence intervals by decomposing the estimation error into a Gaussian and an asymptotically vanishing bias component. However, in real-world problems with finite-dimensional data, the bias term is often too significant to be neglected, resulting in overly narrow confidence intervals. Our work rigorously addresses this issue and derives a data-driven adjustment that corrects the confidence intervals for a large class of predictors by estimating the means and variances of the bias terms from training data, exploiting high-dimensional concentration phenomena. This gives rise to non-asymptotic confidence intervals, which can help avoid overestimating uncertainty in critical applications such as MRI diagnosis. Importantly, our analysis extends beyond sparse regression to data-driven predictors like neural networks, enhancing the reliability of model-based deep learning. Our findings bridge the gap between established theory and the practical applicability of such debiased methods.
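
The sketch below is a minimal illustration (not the paper's method) of the classical debiased LASSO construction the abstract refers to: a LASSO fit, a one-step debiasing correction, and the resulting asymptotic confidence intervals. The design, noise level, and regularization parameter are assumed values for an isotropic Gaussian setting, where the approximate precision matrix can be taken as the identity; the paper's actual contribution, a non-asymptotic data-driven correction of the bias term, is not reproduced here.

```python
# Illustrative sketch of the classical debiased LASSO and its asymptotic
# confidence intervals (the baseline the paper improves on). Assumes an
# isotropic Gaussian design, so the approximate precision matrix M = I.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s, sigma = 300, 600, 10, 0.5   # samples, features, sparsity, noise std (assumed values)

X = rng.standard_normal((n, p))      # rows ~ N(0, I_p)
beta = np.zeros(p)
beta[:s] = 1.0                       # s-sparse ground truth
y = X @ beta + sigma * rng.standard_normal(n)

# LASSO fit with the usual sigma * sqrt(2 log p / n) regularization scale
lam = sigma * np.sqrt(2 * np.log(p) / n)
beta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

# One-step debiasing with M = I: add a correction built from the residual
residual = y - X @ beta_hat
beta_deb = beta_hat + X.T @ residual / n

# Plug-in noise estimate and asymptotic 95% confidence intervals per coordinate
sigma_hat = np.linalg.norm(residual) / np.sqrt(n)
Sigma_hat = X.T @ X / n
half_width = norm.ppf(0.975) * sigma_hat * np.sqrt(np.diag(Sigma_hat) / n)
lower, upper = beta_deb - half_width, beta_deb + half_width

# In finite dimensions the neglected bias term can make these intervals too
# narrow (under-coverage), which is the issue the paper's non-asymptotic,
# data-driven adjustment addresses.
print(f"empirical coverage: {np.mean((beta >= lower) & (beta <= upper)):.3f}")
```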

MCML Authors
Claudio Mayrink Verdun

Dr.

* Former member

A2 | Mathematical Foundations

Hannah Laus

Optimization & Data Analysis

A2 | Mathematical Foundations

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Holger Rauhut

Prof. Dr.

Mathematical Data Science and Artificial Intelligence

A2 | Mathematical Foundations


[11]
F. Hoppe, C. M. Verdun, H. Laus, S. Endt, M. I. Menzel, F. Krahmer and H. Rauhut.
Imaging with Confidence: Uncertainty Quantification for High-dimensional Undersampled MR Images.
18th European Conference on Computer Vision (ECCV 2024). Milan, Italy, Sep 29-Oct 04, 2024. To be published.
MCML Authors
Claudio Mayrink Verdun

Dr.

* Former member

A2 | Mathematical Foundations

Hannah Laus

Optimization & Data Analysis

A2 | Mathematical Foundations

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Holger Rauhut

Prof. Dr.

Mathematical Data Science and Artificial Intelligence

A2 | Mathematical Foundations


[10]
C. M. Verdun, O. Melnyk, F. Krahmer and P. Jung.
Fast, blind, and accurate: Tuning-free sparse regression with global linear convergence.
37th Annual Conference on Learning Theory (COLT 2024). Edmonton, Canada, Jun 30-Jul 03, 2024. URL.
MCML Authors
Claudio Mayrink Verdun

Dr.

* Former member

A2 | Mathematical Foundations

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[9]
F. Hoppe, C. M. Verdun, H. Laus, F. Krahmer and H. Rauhut.
Uncertainty Quantification For Learned ISTA.
IEEE Workshop on Machine Learning for Signal Processing (MLSP 2023). Rome, Italy, Sep 17-20, 2023. DOI.
MCML Authors
Claudio Mayrink Verdun

Dr.

* Former member

A2 | Mathematical Foundations

Hannah Laus

Optimization & Data Analysis

A2 | Mathematical Foundations

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Holger Rauhut

Prof. Dr.

Mathematical Data Science and Artificial Intelligence

A2 | Mathematical Foundations


[8]
S. Bamberger, R. Heckel and F. Krahmer.
Approximating Positive Homogeneous Functions with Scale Invariant Neural Networks.
Preprint at arXiv (Aug. 2023). arXiv.
MCML Authors
Reinhard Heckel

Prof. Dr.

Machine Learning

A2 | Mathematical Foundations

Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[7]
T. Fuchs, F. Krahmer and R. Kueng.
Greedy-type sparse recovery from heavy-tailed measurements.
International Conference on Sampling Theory and Applications (SampTA 2023). Yale, CT, USA, Jul 10-14, 2023. DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[6]
F. Hoppe, F. Krahmer, C. M. Verdun, M. I. Menzel and H. Rauhut.
Sampling Strategies for Compressive Imaging Under Statistical Noise.
International Conference on Sampling Theory and Applications (SampTA 2023). Yale, CT, USA, Jul 10-14, 2023. DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Claudio Mayrink Verdun

Dr.

* Former member

A2 | Mathematical Foundations

Holger Rauhut

Prof. Dr.

Mathematical Data Science and Artificial Intelligence

A2 | Mathematical Foundations


[5]
R. Joy, F. Krahmer, A. Lupoli and R. Ramakrishan.
Quantization of Bandlimited Functions Using Random Samples.
International Conference on Sampling Theory and Applications (SampTA 2023). Yale, CT, USA, Jul 10-14, 2023. DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[4]
F. Krahmer, H. Lyu, R. Saab, A. Veselovska and R. Wang.
Quantization of Bandlimited Graph Signals.
International Conference on Sampling Theory and Applications (SampTA 2023). Yale, CT, USA, Jul 10-14, 2023. DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Hanna Veselovska

Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[3]
F. Krahmer and A. Veselovska.
Digital Halftoning via Mixed-Order Weighted Σ∆ Modulation.
International Conference on Sampling Theory and Applications (SampTA 2023). Yale, CT, USA, Jul 10-14, 2023. DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Hanna Veselovska

Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[2]
F. Krahmer and A. Veselovska.
Enhanced Digital Halftoning via Weighted Sigma-Delta Modulation.
SIAM Journal on Imaging Sciences 16.3 (Jul. 2023). DOI.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations

Hanna Veselovska

Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations


[1]
J. Kostin, F. Krahmer and D. Stöger.
How robust is randomized blind deconvolution via nuclear norm minimization against adversarial noise?
Preprint at arXiv (Mar. 2023). arXiv.
MCML Authors
Felix Krahmer

Prof. Dr.

Optimization & Data Analysis

A2 | Mathematical Foundations