
M4 Adapter: Multilingual Multi-Domain Adaptation for Machine Translation With a Meta-Adapter

MCML Authors

Alexandra Chronopoulou, Dr.

Alexander Fraser, Prof. Dr., Principal Investigator

Abstract

Multilingual neural machine translation models (MNMT) yield state-of-the-art performance when evaluated on data from a domain and language pair seen at training time. However, when an MNMT model is used to translate under domain shift or to a new language pair, performance drops dramatically. We consider a very challenging scenario: adapting the MNMT model both to a new domain and to a new language pair at the same time. In this paper, we propose m4Adapter (Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter), which combines domain and language knowledge using meta-learning with adapters. We present results showing that our approach is a parameter-efficient solution which effectively adapts a model to both a new language pair and a new domain, while outperforming other adapter methods. An ablation study also shows that our approach more effectively transfers domain knowledge across different languages and language information across different domains.
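The adapters referred to in the abstract are small bottleneck modules inserted into a frozen backbone, so that only the adapter parameters need training during adaptation. Below is a minimal NumPy sketch of a generic bottleneck adapter with a residual connection; the dimensions, weight initialization, and class name are illustrative assumptions, not details taken from the m4Adapter paper.

```python
import numpy as np

class Adapter:
    """Generic bottleneck adapter (illustrative sketch, not the paper's exact design)."""

    def __init__(self, d_model=512, bottleneck=64, seed=0):
        rng = np.random.default_rng(seed)
        # Down-project to a small bottleneck, then up-project back to d_model.
        # Only these two matrices would be trained; the backbone stays frozen.
        self.W_down = rng.normal(0.0, 0.02, (d_model, bottleneck))
        self.W_up = rng.normal(0.0, 0.02, (bottleneck, d_model))

    def __call__(self, h):
        # ReLU non-linearity in the bottleneck; the residual connection
        # preserves the frozen backbone's representation.
        z = np.maximum(h @ self.W_down, 0.0)
        return h + z @ self.W_up

adapter = Adapter()
h = np.ones((2, 512))   # (tokens, d_model) hidden states from a frozen layer
out = adapter(h)
print(out.shape)        # same shape as the input: (2, 512)
```

The parameter efficiency claimed in the abstract comes from this design: with d_model = 512 and bottleneck = 64, each adapter adds roughly 65k parameters, a small fraction of a full Transformer layer.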

inproceedings


Findings @EMNLP 2022

Findings of the Conference on Empirical Methods in Natural Language Processing. Abu Dhabi, United Arab Emirates, Nov 07-11, 2022.
A* Conference

Authors

W. Lai, A. Chronopoulou, A. Fraser

Links

DOI

Research Area

 B2 | Natural Language Processing

BibTeX Key: LCF22
