
Neural Emulator Superiority: When Machine Learning for PDEs Surpasses Its Training Data

MCML Authors


Nils Thuerey

Prof. Dr.

Principal Investigator

Abstract

Neural operators or emulators for PDEs trained on data from numerical solvers are conventionally assumed to be limited by their training data's fidelity. We challenge this assumption by identifying 'emulator superiority', where neural networks trained purely on low-fidelity solver data can achieve higher accuracy than those solvers when evaluated against a higher-fidelity reference. Our theoretical analysis reveals how the interplay between emulator inductive biases, training objectives, and numerical error characteristics enables superior performance during multi-step rollouts. We empirically validate this finding across different PDEs using standard neural architectures, demonstrating that emulators can implicitly learn dynamics that are more regularized or exhibit more favorable error accumulation properties than their training data, potentially surpassing training data limitations and mitigating numerical artifacts. This work prompts a re-evaluation of emulator benchmarking, suggesting neural emulators might achieve greater physical fidelity than their training source within specific operational regimes.
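The comparison described in the abstract can be made concrete with a small evaluation sketch, not taken from the paper or its repository: roll out both the low-fidelity training solver and a learned emulator from the same initial condition, then measure each trajectory's error against a higher-fidelity reference. The toy heat-equation setup and names such as `low_fidelity_step` are illustrative assumptions; a trained emulator's one-step map would be plugged into the same error measurement.

```python
import numpy as np

def rollout(step_fn, u0, n_steps):
    """Apply a one-step map repeatedly and stack the resulting trajectory."""
    traj = [u0]
    for _ in range(n_steps):
        traj.append(step_fn(traj[-1]))
    return np.stack(traj)

def rollout_errors(step_fn, reference, u0):
    """Per-step L2 error of a multi-step rollout against a high-fidelity reference."""
    traj = rollout(step_fn, u0, len(reference) - 1)
    return np.linalg.norm(traj - reference, axis=-1)

# Toy setting: 1D heat equation on a periodic grid of length 2*pi. The
# high-fidelity reference uses the exact spectral update; the low-fidelity
# solver is a coarse forward-Euler finite-difference step (the kind of
# solver that would generate the emulator's training data).
N, dt, nu = 128, 1e-3, 0.1
dx = 2 * np.pi / N
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

def reference_step(u):
    # Exact (high-fidelity) update for the linear heat equation.
    return np.real(np.fft.ifft(np.fft.fft(u) * np.exp(-nu * k**2 * dt)))

def low_fidelity_step(u):
    # Coarse finite-difference update; source of the training data.
    return u + nu * dt * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2

u0 = np.sin(np.arange(N) * dx)
ref_traj = rollout(reference_step, u0, 200)

# A trained emulator's one-step prediction would replace `low_fidelity_step`
# here; "emulator superiority" means its error curve stays below the solver's
# even though it was trained only on low-fidelity solver trajectories.
print(rollout_errors(low_fidelity_step, ref_traj, u0))
```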



NeurIPS 2025

39th Conference on Neural Information Processing Systems. San Diego, CA, USA, Nov 30-Dec 07, 2025. To be published.
A* Conference

Authors

F. Köhler • N. Thuerey

Links

URL • GitHub

Research Area

B1 | Computer Vision

BibTeX Key: KT25
