05.05.2026
MunichNLP X MCML Meetup
Short Recap
In collaboration with MunichNLP, MCML hosted the April edition of the meetup, welcoming around 50 participants. The event once again highlighted the community's strong interest in high-quality academic and science-oriented discussions on AI.
The first talk, by Yejin Choi (Stanford University), focused on reasoning in large language models. She discussed the limitations of pure scaling approaches and emphasized the importance of carefully designed reinforcement learning pipelines, diverse and high-quality data, and maintaining entropy during training. A key message was that smaller models, when trained thoughtfully, can achieve competitive performance, supporting the broader goal of making AI more accessible.
The second talk, by Kathy Hämmerl (Technical University of Munich), addressed cross-lingual representations in multilingual models. She highlighted the strong English bias in current systems and presented approaches such as token alignment, contrastive learning, and fine-tuning with parallel data to improve knowledge transfer across languages. This line of research contributes to more inclusive and globally applicable AI systems.
Thanks to all speakers, participants, and supporters for making this event possible!
Related
30.04.2026
MCML Stammtisch - Recap
The MCML Stammtisch was a great opportunity to connect our members across disciplines and enjoy some relaxed conversations outside the labs.
27.04.2026
Girls' Day 2026
At Girls’ Day 2026, students explored AI in everyday life through interactive sessions on deepfakes, bias, and the future of work.
23.04.2026
Teaching NLP in the LLM Era: Workshop TeachNLP 2026
MCML member Matthias Aßenmacher co-organized TeachNLP 2026, a workshop on designing NLP courses in the era of generative AI and LLMs.