
PERSEVAL: A Framework for Perspectivist Classification Evaluation


Abstract

Data perspectivism goes beyond majority vote label aggregation by recognizing various perspectives as legitimate ground truths. However, current evaluation practices remain fragmented, making it difficult to compare perspectivist approaches and analyze their impact on different users and demographic subgroups. To address this gap, we introduce PersEval, the first unified framework for evaluating perspectivist models in NLP. A key innovation is its evaluation at the individual annotator level and its treatment of annotators and users as distinct entities, consistent with real-world scenarios. We demonstrate PersEval's capabilities through experiments with both encoder-based and decoder-based approaches, as well as an analysis of the effect of sociodemographic prompting. By considering global, text-, trait-, and user-level evaluation metrics, we show that PersEval is a powerful tool for examining how models are influenced by user-specific information and identifying the biases this information may introduce.
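
The abstract's central idea is scoring a model against each annotator's own labels rather than a single aggregated label. The following sketch is a rough illustration of that contrast only, not PersEval's actual API; the data, names, and metric choice (accuracy) are all hypothetical.

    # Conceptual sketch (not the PersEval API): annotator-level evaluation
    # versus the majority-vote baseline that perspectivism moves beyond.
    from collections import Counter, defaultdict

    # Hypothetical records: (text_id, annotator_id, gold_label, predicted_label),
    # where the model predicts one label per annotator (perspectivist setting).
    records = [
        ("t1", "a1", 1, 1), ("t1", "a2", 0, 1), ("t1", "a3", 1, 1),
        ("t2", "a1", 0, 0), ("t2", "a2", 0, 1), ("t2", "a3", 1, 1),
    ]

    # Annotator-level view: score predictions against each annotator's own labels.
    per_annotator = defaultdict(list)
    for _, ann, gold, pred in records:
        per_annotator[ann].append(gold == pred)
    for ann, hits in sorted(per_annotator.items()):
        print(f"annotator {ann}: accuracy = {sum(hits) / len(hits):.2f}")

    # Aggregated baseline: majority-vote label per text, one score overall.
    by_text_gold, by_text_pred = defaultdict(list), defaultdict(list)
    for text, _, gold, pred in records:
        by_text_gold[text].append(gold)
        by_text_pred[text].append(pred)
    majority_hits = [
        Counter(by_text_gold[t]).most_common(1)[0][0]
        == Counter(by_text_pred[t]).most_common(1)[0][0]
        for t in by_text_gold
    ]
    print(f"majority-vote accuracy: {sum(majority_hits) / len(majority_hits):.2f}")

A per-annotator breakdown like this is what makes trait- and user-level analyses possible: individual scores can be grouped by sociodemographic trait instead of being averaged away in a single aggregate.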

inproceedings


EMNLP 2025

Conference on Empirical Methods in Natural Language Processing. Suzhou, China, Nov 04-09, 2025. To be published. Preprint available.
A* Conference

Authors

S. M. Lo • S. Casola • E. Sezerer • V. Basile • F. Sansonetti • A. Uva • D. Bernardi

Links

PDF

Research Area

 B2 | Natural Language Processing

BibTeX Key: LCB+25
