
Large-scale pretraining: the nitty-gritty details

Robert Baldock, Aleph Alpha

21.02.2024

2:15 pm - 3:45 pm

LMU Department of Statistics and via Zoom

This talk offers a rare close-up look at the nitty-gritty details that go into training large-scale LLMs. In the autumn of 2023, the Aleph Alpha Research Lab prepared to train its next generation of large language models, which are now in training.

In this talk, Robert Baldock will chronicle the lessons learned from this process. In particular, he will describe the team's experiments to optimise the architecture and pretraining, their optimal scaling study, insights into efficient and numerically stable parallel training, tokenizer construction, and the preparation of the large-scale web-crawl dataset.

