
Explaining Bayesian Optimization by Shapley Values Facilitates Human-AI Collaboration for Exosuit Personalization


Abstract

Bayesian optimization (BO) has become indispensable for black-box optimization. However, BO is often considered a black box itself, lacking transparency in the rationale behind proposed parameters. This is particularly relevant in human-in-the-loop applications like the personalization of wearable robotic devices. We address BO’s opacity by proposing ShapleyBO, a framework for interpreting BO proposals by game-theoretic Shapley values. Our approach quantifies the contribution of each parameter to BO’s acquisition function (AF). By leveraging the linearity of Shapley values, ShapleyBO can identify the influence of each parameter on BO’s exploration and exploitation behavior. Our method gives rise to a ShapleyBO-assisted human-machine interface (HMI), allowing users to interfere with BO in case proposals do not align with human reasoning. We demonstrate this HMI’s benefits for the use case of personalizing wearable robotic devices (assistive back exosuits) by human-in-the-loop BO. Results suggest that human-BO teams with access to ShapleyBO outperform teams without access to ShapleyBO.
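The exploration-exploitation decomposition described in the abstract can be illustrated with a minimal sketch. The following Python example is an illustrative assumption, not the authors' implementation: it fits a scikit-learn Gaussian process surrogate, uses an upper-confidence-bound (UCB) acquisition function, and computes exact Shapley values of a proposed configuration's parameters relative to an incumbent baseline. All names (ucb, shapley_values, proposal, baseline) and the toy data are hypothetical.

```python
# Minimal sketch (assumed setup, not the paper's code): Shapley attribution of a
# UCB acquisition function over a BO proposal, split into exploitation and
# exploration contributions via the linearity of Shapley values.
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy surrogate: GP fitted to a few observed (parameters, objective) pairs.
X_obs = rng.uniform(0, 1, size=(8, 3))
y_obs = np.sin(3 * X_obs[:, 0]) + X_obs[:, 1] ** 2 - 0.5 * X_obs[:, 2]
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True).fit(X_obs, y_obs)

def ucb(x, beta=2.0):
    """UCB acquisition: exploitation (posterior mean) + beta * exploration (posterior std)."""
    mu, sigma = gp.predict(np.atleast_2d(x), return_std=True)
    return float(mu[0] + beta * sigma[0]), float(mu[0]), float(sigma[0])

def shapley_values(value_fn, proposal, baseline):
    """Exact Shapley values: average marginal contribution of switching each
    parameter from its baseline value to the proposed value, over all coalitions."""
    d = len(proposal)
    phi = np.zeros(d)
    for j in range(d):
        others = [k for k in range(d) if k != j]
        for size in range(d):
            for S in combinations(others, size):
                x_S = baseline.copy()
                x_S[list(S)] = proposal[list(S)]
                x_Sj = x_S.copy()
                x_Sj[j] = proposal[j]
                weight = factorial(size) * factorial(d - size - 1) / factorial(d)
                phi[j] += weight * (value_fn(x_Sj) - value_fn(x_S))
    return phi

baseline = X_obs[np.argmax(y_obs)]    # incumbent configuration as reference point
proposal = rng.uniform(0, 1, size=3)  # stand-in for a configuration proposed by BO

beta = 2.0
phi_exploit = shapley_values(lambda x: ucb(x, beta)[1], proposal, baseline)
phi_explore = shapley_values(lambda x: ucb(x, beta)[2], proposal, baseline)
# Linearity of Shapley values: attribution of the AF splits additively.
phi_af = phi_exploit + beta * phi_explore
print("exploitation contributions:", np.round(phi_exploit, 3))
print("exploration contributions: ", np.round(phi_explore, 3))
print("acquisition contributions: ", np.round(phi_af, 3))
```

Because the UCB acquisition is the sum of a mean (exploitation) term and a weighted uncertainty (exploration) term, the linearity of Shapley values lets each parameter's acquisition attribution split additively into an exploitation and an exploration share, which is the property the abstract refers to.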



ECML-PKDD 2025

European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases. Porto, Portugal, Sep 15-19, 2025.
A Conference

Authors

J. Rodemann • F. Croppi • P. Arens • Y. Sale • J. Herbinger • B. Bischl • E. Hüllermeier • T. Augustin • C. J. Walsh • G. Casalicchio

Links

DOI

Research Areas

A1 | Statistical Foundations & Explainability

A3 | Computational Models

BibTeX Key: RCA+25
