
PVGRU: Generating Diverse and Relevant Dialogue Responses via Pseudo-Variational Mechanism

MCML Authors


Hinrich Schütze

Prof. Dr.

Principal Investigator

Abstract

We investigate response generation for multi-turn dialogue in generative chatbots. Existing generative models based on RNNs (Recurrent Neural Networks) usually employ the last hidden state to summarize the dialogue history, which leaves them unable to capture the subtle variability observed in different dialogues and unable to distinguish dialogues that are similar in composition. In this paper, we propose the Pseudo-Variational Gated Recurrent Unit (PVGRU). The key novelty of PVGRU is a recurrent summarizing variable that aggregates the accumulated distribution variations of subsequences. We train PVGRU without relying on posterior knowledge, thus avoiding the training-inference inconsistency problem. PVGRU can perceive subtle semantic variability through summarizing variables optimized by two training objectives: distribution consistency and reconstruction. In addition, we build a Pseudo-Variational Hierarchical Dialogue (PVHD) model based on PVGRU. Experimental results demonstrate that PVGRU broadly improves the diversity and relevance of responses on two benchmark datasets.
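To make the core idea concrete, the following is a minimal sketch of a GRU-style cell augmented with a recurrent summarizing variable, as the abstract describes. It is an illustration only: the weight names, the way the summary conditions the gates, and the reparameterized perturbation that accumulates into the summary are all assumptions for demonstration, not the paper's actual PVGRU equations or its training objectives.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class PseudoVariationalGRUCell:
    """Toy GRU cell with a recurrent summarizing variable `s`.

    Sketch only: `s` accumulates a sampled perturbation at each step
    (reparameterization trick) so the recurrence can reflect subsequence
    variability. All parameter names here are hypothetical; the real
    PVGRU update rules are defined in the paper.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        # Standard GRU parameters: update gate z, reset gate r, candidate state.
        self.Wz = rng.normal(0, scale, (hidden_size, input_size + hidden_size))
        self.Wr = rng.normal(0, scale, (hidden_size, input_size + hidden_size))
        self.Wh = rng.normal(0, scale, (hidden_size, input_size + hidden_size))
        # Hypothetical maps producing the mean / log-variance of the step-wise
        # distribution whose samples accumulate into the summary `s`.
        self.Wmu = rng.normal(0, scale, (hidden_size, hidden_size))
        self.Wlv = rng.normal(0, scale, (hidden_size, hidden_size))
        self.rng = rng

    def step(self, x, h, s):
        # Condition the gates on both the hidden state and the summary.
        xh = np.concatenate([x, h + s])
        z = sigmoid(self.Wz @ xh)
        r = sigmoid(self.Wr @ xh)
        h_cand = np.tanh(self.Wh @ np.concatenate([x, r * (h + s)]))
        h_new = (1 - z) * h + z * h_cand
        # Sample a perturbation and fold it into the recurrent summary,
        # so `s` aggregates the accumulated variations of the subsequence.
        mu, logvar = self.Wmu @ h_new, self.Wlv @ h_new
        eps = self.rng.standard_normal(mu.shape)
        s_new = s + mu + np.exp(0.5 * logvar) * eps
        return h_new, s_new
```

Because the summary is produced by the forward recurrence itself rather than by a posterior network over the response, a sketch like this can run identically at training and inference time, which is the training-inference consistency property the abstract highlights.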

inproceedings


ACL 2023

61st Annual Meeting of the Association for Computational Linguistics. Toronto, Canada, Jul 09-14, 2023.
A* Conference

Authors

Y. Liu • S. Feng • D. Wang • Y. Zhang • H. Schütze

Links

DOI

Research Area

 B2 | Natural Language Processing

BibTeX Key: LFW+23
