
Mask and You Shall Receive: Optimizing Masked Language Modeling for Pretraining BabyLMs

MCML Authors


Alexander Fraser

Prof. Dr.

Principal Investigator

Abstract

We describe our strategy for the 2025 edition of the BabyLM Challenge. Our main contribution is an improved form of Masked Language Modeling (MLM) that adapts the masking probability of each token according to the model's ability to predict it. The results show a substantial increase in performance on (Super)GLUE tasks over standard MLM. We also incorporate sub-token embeddings, finding that this increases the model's morphological generalization capabilities. Our submission beats the baseline in the strict-small track.
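The adaptive masking idea lends itself to a short illustration. Below is a minimal, hypothetical sketch, not the authors' implementation: it assumes a PyTorch masked LM whose forward pass returns `.logits`, and approximates per-token difficulty by cross-entropy on an unmasked forward pass. The function name `adaptive_mask` and the parameter `base_rate` are illustrative.

```python
# Hypothetical sketch of difficulty-adaptive masking for MLM pretraining.
# Assumes masking probabilities are re-weighted by the current model's
# per-token loss; the paper's exact weighting scheme may differ.
import torch
import torch.nn.functional as F

def adaptive_mask(input_ids, model, mask_token_id, base_rate=0.15):
    """Sample a mask whose per-token probability scales with model loss."""
    with torch.no_grad():
        logits = model(input_ids).logits                  # (batch, seq, vocab)
        # Per-token cross-entropy on the unmasked input: a rough proxy
        # for how hard each token currently is for the model.
        difficulty = F.cross_entropy(
            logits.transpose(1, 2), input_ids, reduction="none"
        )                                                 # (batch, seq)
    # Rescale so the expected fraction of masked tokens stays at base_rate,
    # while harder tokens are masked more often than easier ones.
    probs = base_rate * difficulty / (difficulty.mean(dim=1, keepdim=True) + 1e-8)
    probs = probs.clamp(max=1.0)
    mask = torch.bernoulli(probs).bool()
    masked_ids = input_ids.clone()
    masked_ids[mask] = mask_token_id
    return masked_ids, mask
```

In practice such a scheme would likely be combined with the usual 80/10/10 mask/random/keep split, with difficulties recomputed periodically rather than on every step.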



BabyLM @EMNLP 2025

1st BabyLM Workshop: Accelerating Language Modeling Research with Cognitively Plausible Data at the Conference on Empirical Methods in Natural Language Processing. Suzhou, China, Nov 04-09, 2025.

Authors

L. Edman, A. Fraser

Links

DOI

Research Area

 B2 | Natural Language Processing

BibTeX Key: EF25
