
Efficient Multilingual and Domain Adaptation of Language Models Under Resource Constraints

MCML Authors

Dr. Alexandra Chronopoulou

Abstract

This dissertation develops methods to improve natural language processing (NLP) systems for low-resource languages and diverse domains. For machine translation, it explores bilingual language models, static embeddings, and multilingual systems with adapters, achieving robust performance in low-resource settings. To enhance domain adaptation, it introduces hierarchical tree structures and efficient adapters, enabling better generalization and robustness to domain shifts. These approaches address data disparities and domain variability, advancing adaptable and efficient NLP systems. (Shortened).
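The abstract refers to adapters as the efficiency mechanism for both multilingual translation and domain adaptation. As a point of reference only, the sketch below shows a generic bottleneck adapter layer (down-projection, nonlinearity, up-projection with a residual connection) in PyTorch; the class name, dimensions, and placement are illustrative assumptions, not the dissertation's actual implementation.

# Illustrative sketch of a bottleneck adapter layer, the general technique
# referenced in the abstract; names and sizes are assumptions, not the
# dissertation's implementation.
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    def __init__(self, hidden_dim: int = 768, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)  # down-projection
        self.up = nn.Linear(bottleneck_dim, hidden_dim)    # up-projection
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the frozen backbone's representation;
        # only the small adapter weights are trained per language or domain.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Usage: one adapter is typically inserted per transformer layer and trained
# while the backbone stays frozen.
adapter = BottleneckAdapter()
x = torch.randn(2, 16, 768)  # (batch, sequence length, hidden size)
print(adapter(x).shape)      # torch.Size([2, 16, 768])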

Dissertation

LMU München. Jan. 2024

Authors

A. Chronopoulou

Links

DOI

Research Area

B2 | Natural Language Processing

BibTeX Key: Chr24
