
Explaining Outliers Using Isolation Forest and Shapley Interactions

Abstract

In unsupervised machine learning, Isolation Forest (IsoForest) is a widely used algorithm for the efficient detection of outliers. Identifying the features responsible for observed anomalies is crucial for practitioners, yet the ensemble nature of IsoForest complicates interpretation and comparison. As a remedy, SHAP is a prevalent method to interpret outlier scoring models by assigning contributions to individual features based on the Shapley Value (SV). However, complex anomalies typically involve interactions between features, and it is paramount for practitioners to distinguish such complex anomalies from simple cases. In this work, we propose Shapley Interactions (SIs) to enrich explanations of outliers with feature interactions. SIs, as an extension of the SV, decompose the outlier score into contributions of individual features and interactions of features up to a specified explanation order. We modify IsoForest to compute SIs using TreeSHAP-IQ, an extension of TreeSHAP for tree-based models, using the shapiq package. Using a qualitative and quantitative analysis on synthetic and real-world datasets, we demonstrate the benefit of SIs and feature interactions for outlier explanations over feature contributions alone.
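To illustrate the decomposition the abstract describes, the sketch below computes exact Shapley values and a pairwise Shapley interaction index (SII) for a small toy "outlier score" set function. This is not the paper's modified IsoForest or the shapiq/TreeSHAP-IQ implementation; the `score` game, its feature weights, and the synergy term are purely illustrative assumptions, and the brute-force enumeration is only feasible for a handful of features.

```python
from itertools import combinations
from math import factorial

def shapley_values(v, n):
    """Exact Shapley values for a set function v over players 0..n-1."""
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(len(others) + 1):
            for S in combinations(others, k):
                S = set(S)
                # Shapley weight |S|! (n-|S|-1)! / n! times marginal contribution
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[i] += w * (v(S | {i}) - v(S))
    return phi

def pairwise_sii(v, n, i, j):
    """Exact pairwise Shapley interaction index for players i and j."""
    others = [p for p in range(n) if p not in (i, j)]
    val = 0.0
    for k in range(len(others) + 1):
        for S in combinations(others, k):
            S = set(S)
            # SII weight |S|! (n-|S|-2)! / (n-1)! times discrete second derivative
            w = factorial(len(S)) * factorial(n - len(S) - 2) / factorial(n - 1)
            delta = v(S | {i, j}) - v(S | {i}) - v(S | {j}) + v(S)
            val += w * delta
    return val

# Hypothetical outlier-score game over 3 features: additive effects plus a
# synergy between features 0 and 1 (a stand-in for an interacting anomaly).
weights = [1.0, 2.0, 0.5]
def score(S):
    return sum(weights[i] for i in S) + (5.0 if {0, 1} <= S else 0.0)

phi = shapley_values(score, 3)      # synergy is split between features 0 and 1
i01 = pairwise_sii(score, 3, 0, 1)  # the pairwise SII isolates the synergy
```

The Shapley values alone spread the synergy across features 0 and 1, so the two look like independently anomalous features; the pairwise interaction term makes the joint effect explicit, which is the distinction between simple and complex anomalies the paper targets.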

ESANN 2025

European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Bruges, Belgium, Apr 23-25, 2025.

Authors

R. Visser • F. Fumagalli • M. Muschalik • E. Hüllermeier • B. Hammer

Links

PDF

Research Area

A3 | Computational Models

BibTeX Key: VFM+25
