
Uncertainty Quantification in Pairwise Difference Learning for Classification

MCML Authors

Eyke Hüllermeier, Prof. Dr. (Principal Investigator)

Abstract

Instead of learning a mapping from instances to outcomes in the standard way, the key idea of pairwise difference learning (PDL) is to learn a function that takes two instances as input and predicts the difference between their respective outcomes. Given a function of this kind, predictions for a query instance are derived from every training example and then aggregated. We consider pairwise difference learning for classification (PDC) and propose two uncertainty quantification methods that faithfully represent and measure the classifier’s predictive uncertainty. Building on recent work on decomposing total uncertainty into aleatoric and epistemic components, we propose different ways of extending corresponding measures to the setting of PDC, depending on how and to what extent the underlying base learner represents uncertainty. In large-scale empirical studies, we analyze both the predictive accuracy and the uncertainty awareness of the methods.
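
As a rough illustration of the PDC recipe summarized above (and not the paper's implementation), the following Python sketch trains a base classifier to predict whether a query and an anchor (training example) share the same class, derives one class distribution per anchor, aggregates them by averaging, and splits the entropy of the aggregate into an aleatoric part (average per-anchor entropy) and an epistemic part (the remainder). The class name PDCSketch, the pairing features, and this particular entropy-based decomposition are assumptions made for illustration, not details taken from the paper.

# Hypothetical sketch of pairwise difference classification (PDC) with an
# entropy-based uncertainty split. Not the paper's implementation; the pairing
# features, aggregation, and uncertainty measures are generic stand-ins.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier


class PDCSketch:
    """Learn whether two instances share a class; aggregate over all anchors."""

    def __init__(self, base_learner=None):
        self.base = base_learner or RandomForestClassifier(random_state=0)

    def fit(self, X, y):
        self.X_anchor, self.y_anchor = X, y
        self.classes_ = np.unique(y)
        # All ordered pairs of training instances; target is 1 iff same class.
        i, j = np.meshgrid(np.arange(len(X)), np.arange(len(X)), indexing="ij")
        i, j = i.ravel(), j.ravel()
        pair_features = np.hstack([X[i], X[i] - X[j]])  # instance plus difference
        pair_labels = (y[i] == y[j]).astype(int)
        self.base.fit(pair_features, pair_labels)
        return self

    def _per_anchor_proba(self, x):
        """One class distribution per anchor (training example)."""
        pairs = np.hstack([np.tile(x, (len(self.X_anchor), 1)), x - self.X_anchor])
        p_same = self.base.predict_proba(pairs)[:, 1]   # P(query has anchor's class)
        n, K = len(self.X_anchor), len(self.classes_)
        probs = np.empty((n, K))
        for a in range(n):
            probs[a] = (1.0 - p_same[a]) / (K - 1)      # spread remainder uniformly
            anchor_cls = np.searchsorted(self.classes_, self.y_anchor[a])
            probs[a, anchor_cls] = p_same[a]
        return probs

    def predict_with_uncertainty(self, x):
        per_anchor = self._per_anchor_proba(x)
        mean = per_anchor.mean(axis=0)                  # aggregated prediction
        H = lambda p: -np.sum(p * np.log(np.clip(p, 1e-12, 1.0)), axis=-1)
        total = H(mean)                                 # total uncertainty
        aleatoric = H(per_anchor).mean()                # average per-anchor entropy
        epistemic = total - aleatoric                   # disagreement across anchors
        return self.classes_[np.argmax(mean)], total, aleatoric, epistemic


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    model = PDCSketch().fit(X[::5], y[::5])             # subsample keeps pairing cheap
    label, tu, au, eu = model.predict_with_uncertainty(X[0])
    print(f"prediction={label}  total={tu:.3f}  aleatoric={au:.3f}  epistemic={eu:.3f}")

Because entropy is concave, the entropy of the aggregated distribution is at least the average per-anchor entropy, so the epistemic term in this sketch is non-negative and grows with disagreement among the anchors; the paper studies how such measures should be adapted depending on how, and to what extent, the underlying base learner represents uncertainty.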

article BRH26


Machine Learning 115.12 (Jan. 2023).
Top Journal

Authors

M. K. Belaid • M. Rabus • E. Hüllermeier

Links

DOI

Research Area

A3 | Computational Models

BibTeXKey: BRH26
