
Decomposed Prompting: Probing Multilingual Linguistic Structure Knowledge in Large Language Models

MCML Authors


Frauke Kreuter

Prof. Dr.

Principal Investigator


Hinrich Schütze

Prof. Dr.

Principal Investigator

Abstract

Probing the multilingual knowledge of linguistic structure in LLMs, often framed as sequence labeling, is difficult with current text-to-text prompting strategies, which struggle to maintain consistent output templates. To address this, we introduce a decomposed prompting approach for sequence labeling tasks. Instead of a single text-to-text prompt, our method generates an individual prompt for each token of the input sentence, asking for its linguistic label. We evaluate our method on the Universal Dependencies part-of-speech tagging dataset across 38 languages, using both English-centric and multilingual LLMs. Our findings show that decomposed prompting surpasses the iterative prompting baseline in both efficacy and efficiency under zero- and few-shot settings. Moreover, our analysis of the multilingual performance of English-centric LLMs yields insights into the transferability of linguistic knowledge via multilingual prompting.
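
To illustrate the core idea, the sketch below shows how per-token prompts for POS tagging could be constructed; the prompt wording, the `query_llm` helper, and the fallback label are illustrative assumptions, not the exact template used in the paper.

```python
# Minimal sketch of decomposed prompting for POS tagging.
# NOTE: query_llm(), the prompt wording, and the fallback label are
# illustrative assumptions, not the paper's exact setup.

UPOS_TAGS = ["ADJ", "ADP", "ADV", "AUX", "CCONJ", "DET", "INTJ", "NOUN",
             "NUM", "PART", "PRON", "PROPN", "PUNCT", "SCONJ", "SYM",
             "VERB", "X"]

def query_llm(prompt: str) -> str:
    """Placeholder for a call to an LLM; replace with your model API."""
    raise NotImplementedError

def decomposed_pos_tagging(tokens: list[str]) -> list[str]:
    """Ask the LLM for one label per token, instead of one text-to-text
    prompt that must reproduce the whole labeled sequence."""
    sentence = " ".join(tokens)
    labels = []
    for token in tokens:
        prompt = (
            f"Sentence: {sentence}\n"
            f"What is the part-of-speech tag of the word '{token}' "
            f"in this sentence? Answer with one of: {', '.join(UPOS_TAGS)}."
        )
        answer = query_llm(prompt).strip().upper()
        # Fall back to 'X' if the model's answer is not a valid UPOS tag.
        labels.append(answer if answer in UPOS_TAGS else "X")
    return labels
```

Because each prompt asks for only a single label, the output format is trivial to parse and no full-sequence output template needs to be maintained.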

inproceedings NYM+25


IJCNLP 2025

14th International Joint Conference on Natural Language Processing. Mumbai, India, Dec 20-24, 2025.

Authors

E. Nie • S. Yuan • B. Ma • H. Schmid • M. Färber • F. Kreuter • H. Schütze

Links

URL

Research Areas

 B2 | Natural Language Processing

 C4 | Computational Social Sciences

BibTeX Key: NYM+25
