
Colloquium

Large-Scale Pretraining: The Nitty-Gritty Details

Robert Baldock, Aleph Alpha

21.02.2024

2:15 pm - 3:45 pm

LMU Department of Statistics and via Zoom

This talk will give a rare close-up of the nitty-gritty details that go into training large-scale LLMs. In the autumn of 2023, the Aleph Alpha Research Lab prepared to train its next generation of large language models, which are now in training.

In this talk, Robert Baldock will chronicle the lessons learned from this process. In particular, he will describe the lab's experiments to optimise the architecture and pretraining, its optimal scaling study, its insights into efficient and numerically stable parallel training, tokenizer construction, and the preparation of the large-scale web-crawl dataset.


Related


AI Keynote Series  •  20.11.2025  •  Online via Zoom

Distilling Heterogeneous Treatment Effects: Stable Subgroup Estimation in Causal Inference

Join the lecture by Melody Huang of Political Science and Statistics & Data Science at Yale University.

