30.10.2025
Language Shapes Gender Bias in AI Images
TUM News
Alexander Fraser, MCML PI, and his team found that AI image generators reproduce gender stereotypes differently across languages. In a study spanning nine languages, generic prompts such as “accountant” mostly produced images of men, while explicitly feminine or gender-neutral prompts reduced the bias but sometimes degraded image quality.
The study highlights that AI is not language‑agnostic and that careful prompt wording can influence outcomes, underlining the need for fairness and multilingual awareness in AI systems.
Related
20.11.2025
Zigzag Your Way to Faster, Smarter AI Image Generation
ZigMa, introduced by Björn Ommer’s group at ECCV 2024, improves high-resolution AI image and video generation with fast, memory-efficient zigzag scanning.
13.11.2025
Anne-Laure Boulesteix Among the World’s Most Cited Researchers
MCML PI Anne‑Laure Boulesteix named a Highly Cited Researcher 2025 for cross-field work, one of 17 LMU scholars recognized globally.
13.11.2025
Björn Ommer Featured in Frankfurter Rundschau
Björn Ommer highlights how Google’s new AI search mode impacts publishers, content visibility, and the diversity of online information.
13.11.2025
Fabian Theis Among the World’s Most Cited Researchers
Fabian Theis is named a Highly Cited Researcher 2025 for his work in mathematical modeling of biological systems.