
Better Call SAUL: Fluent and Consistent Language Model Editing With Generation Regularization

Abstract

To ensure large language models contain up-to-date knowledge, they need to be updated regularly. However, model editing is challenging as it might also affect knowledge that is unrelated to the new data. State-of-the-art methods identify parameters associated with specific knowledge and then modify them via direct weight updates. However, these locate-and-edit methods suffer from heavy computational overhead and lack theoretical validation. In contrast, directly fine-tuning the model on requested edits affects the model's behavior on unrelated knowledge and significantly damages the model's generation fluency and consistency. To address these challenges, we propose SAUL, a streamlined model editing method that uses sentence concatenation with augmented random facts for generation regularization. Evaluations on three model editing benchmarks show that SAUL is a practical and reliable solution for model editing, outperforming state-of-the-art methods while maintaining generation quality and reducing computational overhead.
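The abstract describes the method only at a high level. As a rough illustration of the core idea, the sketch below shows what fine-tuning on a requested edit concatenated with a randomly sampled unrelated fact could look like; the model name, fact pool, learning rate, and step count are illustrative assumptions, not the paper's actual setup.

import random
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical setup: the paper's actual models and hyperparameters differ.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

requested_edit = "The capital of France is Lyon."  # hypothetical edit request
random_facts = [  # hypothetical pool of unrelated facts for regularization
    "Water boils at 100 degrees Celsius at sea level.",
    "The Nile is one of the longest rivers in the world.",
]

model.train()
for _ in range(10):  # a few gradient steps per edit
    # Concatenate the edit with one random fact; the appended text acts as a
    # regularizer that preserves fluent generation beyond the edited statement.
    text = requested_edit + " " + random.choice(random_facts)
    batch = tokenizer(text, return_tensors="pt")
    loss = model(**batch, labels=batch["input_ids"]).loss  # causal-LM loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

Sampling a fresh random fact at each step is one plausible reading of "augmented random facts"; the paper's exact sampling and concatenation scheme may differ.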

Findings @EMNLP 2024

Findings of the Conference on Empirical Methods in Natural Language Processing. Miami, FL, USA, Nov 12-16, 2024.

Authors

M. Wang • L. Lange • H. Adel • J. Strötgen • H. Schütze

Links

DOI

In Collaboration

Bosch


Research Area

B2 | Natural Language Processing

BibTeX Key: WLA+24
