Dr. Julia Herbinger (Former Member)
Bayesian optimization (BO) has become indispensable for black-box optimization. However, BO is often considered a black box itself, lacking transparency in the rationale behind proposed parameters. This is particularly relevant in human-in-the-loop applications such as the personalization of wearable robotic devices. We address BO’s opacity by proposing ShapleyBO, a framework for interpreting BO proposals via game-theoretic Shapley values. Our approach quantifies the contribution of each parameter to BO’s acquisition function (AF). By leveraging the linearity of Shapley values, ShapleyBO can identify the influence of each parameter on BO’s exploration and exploitation behaviors. Our method gives rise to a ShapleyBO-assisted human-machine interface (HMI), allowing users to intervene in BO in case proposals do not align with human reasoning. We demonstrate this HMI’s benefits for the use case of personalizing wearable robotic devices (assistive back exosuits) by human-in-the-loop BO. Results suggest that human-BO teams with access to ShapleyBO outperform teams without access to ShapleyBO.
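The decomposition described in the abstract can be illustrated with a small sketch. The acquisition and surrogate functions below are hypothetical stand-ins (a UCB-style `af = mu + beta * sigma`, not the paper's actual models); the sketch computes exact Shapley values with a simple baseline-replacement value function and checks that, by linearity, the Shapley values of the AF split into an exploitation part (from `mu`) and an exploration part (from `sigma`).

```python
from itertools import combinations
from math import factorial

# Hypothetical UCB-style acquisition: af = mu + BETA * sigma.
# mu/sigma are toy stand-ins for a posterior mean and std, for illustration only.
BETA = 2.0

def mu(x):      # illustrative "exploitation" term
    return 1.5 * x[0] + x[1] * x[2]

def sigma(x):   # illustrative "exploration" term
    return 0.5 * x[0] + 0.25 * x[2]

def af(x):
    return mu(x) + BETA * sigma(x)

def shapley(f, x, baseline):
    """Exact Shapley values of f at point x.

    Absent parameters are replaced by their baseline value (a simple
    replacement-style value function, assumed here for illustration)."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                x_with = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                x_without = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += weight * (f(x_with) - f(x_without))
    return phi

x = [1.0, 2.0, 3.0]
base = [0.0, 0.0, 0.0]

phi_af = shapley(af, x, base)       # contribution of each parameter to the AF
phi_mu = shapley(mu, x, base)       # ... to exploitation
phi_sigma = shapley(sigma, x, base) # ... to exploration

# Linearity of Shapley values: the AF attribution decomposes exactly.
for i in range(3):
    assert abs(phi_af[i] - (phi_mu[i] + BETA * phi_sigma[i])) < 1e-9
```

The efficiency property also guarantees that the per-parameter contributions sum to `af(x) - af(base)`, which is what lets a user read off how much each parameter drives a given BO proposal.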
inproceedings
BibTeX key: RCA+25