
Variation Is the Norm: Embracing Sociolinguistics in NLP

MCML Authors

Abstract

In Natural Language Processing (NLP), variation is typically treated as noise and 'normalised away' before processing, even though it is an integral part of language. Conversely, studying language variation in social contexts is central to sociolinguistics. We present a framework that combines the sociolinguistic dimension of language with the technical dimension of NLP. We argue that by embracing sociolinguistics, variation can be actively included in a research setup, in turn informing the NLP side. To illustrate this, we provide a case study on Luxembourgish, an evolving language featuring a large amount of orthographic variation, and demonstrate how NLP performance is affected. The results show large discrepancies in the performance of models tested and fine-tuned on data with a large amount of orthographic variation compared to data closer to the (orthographic) standard. Furthermore, we provide a possible solution that improves performance by including variation in the fine-tuning process. This case study highlights the importance of including variation in the research setup, as current models are not robust to the variation that occurs. Our framework facilitates the inclusion of variation in the research process while also being grounded in the theoretical framework of sociolinguistics.



LREC 2026

15th International Conference on Language Resources and Evaluation. Palma de Mallorca, Spain, May 11-16, 2026. To be published. Preprint available.

Authors

A.-M. Lutgen • A. Plum • V. Blaschke • B. Plank • C. Purschke

Links

arXiv

Research Area

B2 | Natural Language Processing

BibTeX Key: LPB+26a
