
Adaptive Contrastive Search: Uncertainty-Guided Decoding for Open-Ended Text Generation

Abstract

Decoding from the output distributions of large language models to produce high-quality text is a complex challenge in language modeling. Various approaches, such as beam search, sampling with temperature, top-k sampling, nucleus (top-p) sampling, typical decoding, contrastive decoding, and contrastive search, have been proposed to address this problem, aiming to improve coherence, diversity, and resemblance to human-generated text. In this study, we introduce adaptive contrastive search, a novel decoding strategy that extends contrastive search with an adaptive degeneration penalty, guided by the estimated uncertainty of the model at each generation step. This strategy is designed to enhance the creativity and diversity of the language modeling process while producing coherent, high-quality output. Our findings indicate performance improvements on both fronts, across different model architectures and datasets, underscoring the effectiveness of our method in text generation tasks. Our code base, datasets, and models are publicly available.
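To make the idea concrete, here is a minimal sketch of one generation step. The abstract does not specify how uncertainty is estimated or how it sets the penalty, so this sketch assumes the normalized entropy of the next-token distribution as the uncertainty signal and uses it directly as the penalty weight alpha; the function name `adaptive_contrastive_step` and this particular mapping are hypothetical, not the authors' reference implementation.

```python
# Sketch of one step of uncertainty-guided contrastive search.
# Assumption (not from the abstract): alpha = normalized entropy of the
# next-token distribution; high uncertainty -> stronger degeneration
# penalty (more diversity), low uncertainty -> trust model confidence.
import torch
import torch.nn.functional as F

def adaptive_contrastive_step(logits, cand_hidden, prev_hidden, k=5):
    """Select the next token id.

    logits:      (vocab,) next-token logits from the language model.
    cand_hidden: (k, d) hidden states the model would produce for each
                 of the top-k candidate continuations.
    prev_hidden: (t, d) hidden states of the tokens generated so far.
    """
    probs = F.softmax(logits, dim=-1)
    top_p, top_ids = probs.topk(k)

    # Uncertainty estimate: Shannon entropy of the full distribution,
    # normalized to [0, 1] by the maximum entropy log|V|.
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum()
    alpha = (entropy / torch.log(torch.tensor(float(len(probs))))).item()

    # Degeneration penalty (as in standard contrastive search): the max
    # cosine similarity of each candidate's hidden state to any
    # previously generated token's hidden state.
    sim = F.cosine_similarity(
        cand_hidden.unsqueeze(1), prev_hidden.unsqueeze(0), dim=-1
    ).max(dim=1).values  # (k,)

    # Contrastive objective with a step-wise adaptive alpha instead of
    # the fixed alpha used by vanilla contrastive search.
    score = (1 - alpha) * top_p - alpha * sim
    return top_ids[score.argmax()]
```

In a full decoding loop, `cand_hidden` would come from a forward pass over each top-k candidate continuation, exactly as in vanilla contrastive search; the only change sketched here is recomputing alpha at every step from the model's current uncertainty.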

Findings @EMNLP 2024

Findings of the Conference on Empirical Methods in Natural Language Processing. Miami, FL, USA, Nov 12-16, 2024.
A* Conference

Authors

E. Garces Arias • J. Rodemann • M. Li • C. Heumann • M. Aßenmacher

Links

DOI

Research Area

 A1 | Statistical Foundations & Explainability

BibTeX key: GRL+24b
