
BERT-KNN: Adding a KNN Search Component to Pretrained Language Models for Better QA

MCML Authors


Hinrich Schütze

Prof. Dr.

Principal Investigator

Abstract

Khandelwal et al. (2020) use a k-nearest-neighbor (kNN) component to improve language model performance. We show that this idea is beneficial for open-domain question answering (QA). To improve the recall of facts encountered during training, we combine BERT (Devlin et al., 2019) with a traditional information retrieval step (IR) and a kNN search over a large datastore of an embedded text collection. Our contributions are as follows: i) BERT-kNN outperforms BERT on cloze-style QA by large margins without any further training. ii) We show that BERT often identifies the correct response category (e.g., US city), but only kNN recovers the factually correct answer (e.g., “Miami”). iii) Compared to BERT, BERT-kNN excels for rare facts. iv) BERT-kNN can easily handle facts not covered by BERT’s training set, e.g., recent events.
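
The sketch below illustrates the core idea described in the abstract: interpolating BERT’s masked-LM distribution with a kNN distribution obtained from a datastore of embedded contexts. It is a minimal sketch, not the authors’ released code; the datastore files (keys.npy, values.npy), the helper bert_knn_predict, and the interpolation weight lam are illustrative assumptions.

```python
# Minimal sketch of the BERT-kNN idea (assumed setup, not the paper's implementation):
# a pre-built datastore maps context embeddings to the token that filled the gap.
import numpy as np
import torch
import faiss
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

# Hypothetical datastore built offline from an embedded text collection:
# keys are hidden states at answer positions, values are the answer token ids.
d = model.config.hidden_size
datastore_keys = np.load("keys.npy").astype("float32")   # shape (N, d), assumed
datastore_token_ids = np.load("values.npy")              # shape (N,),   assumed
index = faiss.IndexFlatL2(d)
index.add(datastore_keys)

def bert_knn_predict(cloze: str, k: int = 32, lam: float = 0.3) -> str:
    """Interpolate BERT's masked-LM distribution with a kNN distribution."""
    inputs = tokenizer(cloze, return_tensors="pt")
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)

    # BERT's distribution over the vocabulary at the [MASK] position.
    p_bert = torch.softmax(out.logits[0, mask_pos], dim=-1)

    # Query the datastore with the [MASK] hidden state.
    query = out.hidden_states[-1][0, mask_pos].numpy().astype("float32")[None]
    dists, idx = index.search(query, k)

    # Turn neighbor distances into a distribution over their answer tokens.
    p_knn = torch.zeros_like(p_bert)
    weights = torch.softmax(torch.tensor(-dists[0]), dim=-1)
    for w, j in zip(weights, idx[0]):
        p_knn[int(datastore_token_ids[j])] += w

    # lam is an illustrative interpolation weight, not a value from the paper.
    p = lam * p_knn + (1 - lam) * p_bert
    return tokenizer.convert_ids_to_tokens(int(p.argmax()))

# Example cloze-style query:
# bert_knn_predict("Dirk Nowitzki plays for the Dallas [MASK].")
```

In this sketch the IR step is omitted; the paper first narrows the text collection with traditional retrieval before the kNN search, whereas here the search runs over the full (assumed) datastore.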

inproceedings


Findings @EMNLP 2020

Findings of the Conference on Empirical Methods in Natural Language Processing. Virtual, Nov 16-20, 2020.
A* Conference

Authors

N. Kassner, H. Schütze

Links

DOI

Research Area

 B2 | Natural Language Processing

BibTeX Key: KS20a
