
Research Group Christian Kühn



Christian Kühn

Prof. Dr.

Associate

Multiscale and Stochastic Dynamics

Christian Kühn leads the Multiscale and Stochastic Dynamics group at TU Munich.

The research interests of his group are broad and lie at the interface of differential equations, dynamical systems, and mathematical modelling. In terms of application areas, the group works on a wide range of problems in fields such as biophysics, climate science, ecology, epidemiology, fluid dynamics, and neuroscience. In the context of machine learning, the group is particularly interested in 'mathematics for ML', i.e., in understanding when AI is efficient and robust, and when it is prone to adversarial attacks. In fact, machine learning architectures can be viewed as dynamical systems: DNNs, transformers, and related models are essentially particle systems on networks. Training algorithms, too, are iterative mappings and hence give rise to dynamics; SGD, for example, is a stochastic/random dynamical system.
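The view of SGD as a random dynamical system can be made concrete with a minimal sketch: each iteration applies a parameter update whose "noise" is the randomly drawn sample index. The toy least-squares problem, step size, and all variable names below are illustrative assumptions, not taken from the group's work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize f(theta) = mean_i (x_i . theta - y_i)^2 / 2
X = rng.normal(size=(100, 2))
theta_true = np.array([1.0, -2.0])
y = X @ theta_true  # noise-free labels, so theta_true is the exact minimizer

def sgd_step(theta, eta=0.05):
    """One iteration of the random map theta -> theta - eta * grad f_i(theta),
    where the sample index i is the random input driving the dynamical system."""
    i = rng.integers(len(X))
    grad = (X[i] @ theta - y[i]) * X[i]
    return theta - eta * grad

# Iterating the random map: the orbit of the dynamical system in parameter space.
theta = np.zeros(2)
for _ in range(2000):
    theta = sgd_step(theta)

print(theta)  # converges towards theta_true
```

Viewed this way, questions about training (convergence, stability, escape from saddle points) become questions about the long-term behaviour of a random dynamical system.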

Team members @MCML

PhD Students


Sara-Viola Kuntz

Multiscale and Stochastic Dynamics

Publications @MCML

2025


[1]
C. Kühn and S.-V. Kuntz.
Analysis of the Geometric Structure of Neural Networks and Neural ODEs via Morse Functions.
DS 2025 - SIAM Conference on Applications of Dynamical Systems. Denver, CO, USA, May 11-15, 2025. To be published. Preprint available. arXiv
Abstract

Besides classical feed-forward neural networks, neural ordinary differential equations (neural ODEs) have also gained particular interest in recent years. Neural ODEs can be interpreted as an infinite-depth limit of feed-forward or residual neural networks. We study the input-output dynamics of finite and infinite depth neural networks with scalar output. In the finite depth case, the input is a state associated with a finite number of nodes, which maps under multiple non-linear transformations to the state of one output node. Analogously, a neural ODE maps an affine linear transformation of the input to an affine linear transformation of its time-T map. We show that, depending on the specific structure of the network, the input-output map has different properties regarding the existence and regularity of critical points, which can be characterized via Morse functions. We prove that critical points cannot exist if the dimension of the hidden layer is monotonically decreasing or if the dimension of the phase space is smaller than or equal to the input dimension. In the case that critical points exist, we classify their regularity depending on the specific architecture of the network. We show that, except for a Lebesgue measure zero set in the weight space, each critical point is non-degenerate if, for finite depth neural networks, the underlying graph has no bottleneck, and if, for neural ODEs, the affine linear transformations used have full rank. For each type of architecture, the proven properties are comparable in the finite and the infinite depth case. The established theorems allow us to formulate results on universal embedding, i.e., on the exact representation of maps by neural networks and neural ODEs. Our dynamical systems viewpoint on the geometric structure of the input-output map provides a fundamental understanding of why certain architectures perform better than others.
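The abstract's starting point, that a neural ODE arises as an infinite-depth limit of residual networks, can be illustrated with a small sketch: a residual block x + (1/L) f(x) is an explicit Euler step of x'(t) = f(x), so as the depth L grows the network output approaches the ODE's time-1 flow map. The tanh vector field, weights, and dimensions below are hypothetical choices for illustration, not the paper's construction.

```python
import numpy as np

def f(x, W, b):
    # Shared vector field of the neural ODE x'(t) = tanh(W x + b)
    return np.tanh(W @ x + b)

def resnet_forward(x, W, b, depth):
    """Residual network with `depth` identical blocks: x <- x + (1/depth) f(x).
    Each block is one explicit Euler step of the ODE with step size 1/depth."""
    h = 1.0 / depth
    for _ in range(depth):
        x = x + h * f(x, W, b)
    return x

rng = np.random.default_rng(1)
W = rng.normal(size=(3, 3)) * 0.5
b = rng.normal(size=3) * 0.1
x0 = rng.normal(size=3)

shallow = resnet_forward(x0, W, b, depth=8)
deep = resnet_forward(x0, W, b, depth=1024)
# As depth grows, the outputs converge to the time-1 flow map of the ODE,
# so successive depth refinements change the output less and less.
print(np.linalg.norm(deep - shallow))
```

Doubling the depth from 1024 to 2048 changes the output far less than going from 8 to 1024, consistent with the O(1/depth) error of the Euler discretization.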

MCML Authors

Christian Kühn

Prof. Dr.

Multiscale and Stochastic Dynamics


Sara-Viola Kuntz

Multiscale and Stochastic Dynamics