
Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, but Cannot Fly

MCML Authors


Hinrich Schütze

Prof. Dr.

Principal Investigator

Abstract

Building on Petroni et al. 2019, we propose two new probing tasks analyzing factual knowledge stored in Pretrained Language Models (PLMs). (1) Negation. We find that PLMs do not distinguish between negated ("Birds cannot [MASK]") and non-negated ("Birds can [MASK]") cloze questions. (2) Mispriming. Inspired by priming methods in human psychology, we add "misprimes" to cloze questions ("Talk? Birds can [MASK]"). We find that PLMs are easily distracted by misprimes. These results suggest that PLMs still have a long way to go to adequately learn human-like factual knowledge.
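
The probing setup the abstract describes is straightforward to reproduce with any masked language model: feed the model a cloze question containing a mask token and inspect its top predictions for the original, negated, and misprimed variants. Below is a minimal sketch, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the prompts are illustrative stand-ins, not the paper's actual LAMA-derived probes.

from transformers import pipeline

# Fill-mask pipeline over a standard pretrained BERT model.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

prompts = [
    "Birds can [MASK].",        # original cloze question
    "Birds cannot [MASK].",     # negated probe
    "Talk? Birds can [MASK].",  # misprimed probe
]

for prompt in prompts:
    # Top-3 fillers with their probabilities; a model sensitive to negation
    # and priming should rank fillers differently across the three prompts.
    predictions = fill_mask(prompt, top_k=3)
    top = ", ".join(f"{p['token_str']} ({p['score']:.2f})" for p in predictions)
    print(f"{prompt} -> {top}")

If the paper's findings hold, the top fillers for the first two prompts will largely coincide despite the negation, and the misprime "Talk?" will pull the prediction toward "talk".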

ACL 2020

58th Annual Meeting of the Association for Computational Linguistics. Virtual, Jul 05-10, 2020.
A* Conference

Authors

N. Kassner, H. Schütze

Links

DOI

Research Area

B2 | Natural Language Processing

BibTeX Key: KS20
