Institute for Data Science Foundations

Conference on
Mathematics of Machine Learning 2025

September 22 - 25, 2025

Hamburg University of Technology (TU Hamburg)

Audimax II
Denickestraße 22
21073 Hamburg
Germany

Conference Program

Monday, Sept 22, 2025
09:30 - 09:50 Welcome address
09:50 - 10:40 Gabriele Steidl (TU Berlin, Germany)

Telegrapher’s Generative Model via Kac Flows

We propose a new generative model based on the damped wave equation, also known as the telegrapher’s equation. Just as the diffusion equation is related to Brownian motion via the Feynman-Kac formula, there is an analogous relation between the telegrapher’s equation and the stochastic Kac process in 1D. The Kac flow evolves piecewise linearly in time, so the probability flow is Lipschitz continuous in the Wasserstein distance and, in contrast to diffusion flows, the norm of the velocity is globally bounded. Furthermore, the Kac model has the diffusion model as its asymptotic limit. We extend these considerations to a multi-dimensional stochastic process consisting of independent 1D Kac processes in each spatial component. We show that this process gives rise to an absolutely continuous curve in the Wasserstein space and compute analytically the conditional velocity field starting in a Dirac point. Using the framework of flow matching, we train a neural network that approximates the velocity field and use it for sample generation. Our numerical experiments demonstrate the scalability of our approach and show its advantages over diffusion models. This is joint work with Richard Duong, Jannis Chemseddine and Peter K. Friz.
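For readers unfamiliar with flow matching, the sketch below (PyTorch, illustrative only, not the speakers' code) shows the generic recipe the abstract refers to: a network v_theta(x, t) is regressed onto a conditional velocity field, and samples are generated by integrating the learned ODE. The straight-line conditional path and the toy Gaussian target here are stand-ins for the analytic Kac conditional velocity derived in the talk.

    # Generic flow-matching sketch (illustrative; not the speakers' method).
    # Train v_theta(x, t) to match a conditional velocity field, then sample
    # by integrating dx/dt = v_theta(x, t) from the reference distribution.
    import torch
    import torch.nn as nn

    class VelocityNet(nn.Module):
        def __init__(self, dim=2, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim + 1, hidden), nn.SiLU(),
                nn.Linear(hidden, hidden), nn.SiLU(),
                nn.Linear(hidden, dim),
            )

        def forward(self, x, t):
            return self.net(torch.cat([x, t], dim=-1))

    model = VelocityNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for step in range(2000):
        x1 = torch.randn(256, 2) * 0.5 + 2.0  # toy data distribution
        x0 = torch.randn_like(x1)             # reference (noise) sample
        t = torch.rand(256, 1)                # uniform time in [0, 1]
        # Straight-line conditional path; the Kac model would use its own
        # conditional path and velocity here.
        xt = (1 - t) * x0 + t * x1
        target = x1 - x0                      # conditional velocity
        loss = ((model(xt, t) - target) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()

    # Sample generation: explicit Euler integration of the learned ODE.
    with torch.no_grad():
        x, n_steps = torch.randn(512, 2), 100
        for k in range(n_steps):
            t = torch.full((512, 1), k / n_steps)
            x = x + model(x, t) / n_steps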
10:40 - 11:10 Coffee Break
11:10 - 11:35 Christoph Lampert (Institute of Science and Technology, Austria)

Generalization Guarantees for Multi-task and Meta-learning

tba
11:35 - 12:00 Simon Weissmann (University of Mannheim, Germany)

Almost sure convergence rates for stochastic gradient methods

tba
12:00 - 13:00 Lunch
13:00 - 13:50 Lenaic Chizat (EPFL, Switzerland)

Title: tba

tba
13:50 - 14:15 Viktor Stein (TU Berlin, Germany)

Wasserstein Gradient Flows for Moreau Envelopes of f-Divergences in Reproducing Kernel Hilbert Spaces

tba
14:15 - 14:40 Rishi Sonthalia (Boston College, USA)

Generalization with Non-Standard Spectra

tba
14:40 - 15:10 Coffee Break
15:10 - 16:00 Misha Belkin (University of California San Diego, USA)

Title: tba

tba
16:00 - 16:25 Armin Iske (University of Hamburg, Germany)

On the Convergence of Multiscale Kernel Regression under Minimalistic Assumptions

tba
16:25 - 16:50 Christoph Brune (University of Twente, Netherlands)

Deep Networks are Reproducing Kernel Chains

tba
16:50 - 17:20 Coffee Break
17:20 - 17:45 Marcello Carioni (University of Twente, Netherlands)

Atomic Gradient Descents

tba
17:45 - 18:10 Nisha Chandramoorthy (University of Chicago, USA)

When, why and how are some generative models robust?

tba
Tuesday, Sept 23, 2025
09:00 - 09:50 Stefanie Jegelka (MIT, USA, and TU Munich, Germany)

Title: tba

tba
09:50 - 10:15 Parvaneh Joharinad (Leipzig University and MPI for Mathematics in the Sciences, Germany)

Title: tba

tba
10:15 - 10:40 Diaaeldin Taha (MPI for Mathematics in the Sciences, Germany)

Title: tba

tba
10:40 - 11:10 Coffee Break
11:10 - 11:35 Amanjit Singh (University of Toronto, Canada)

Bregman-Wasserstein gradient flows

tba
11:35 - 12:00 Adwait Datar (Hamburg University of Technology, Germany)

Does the Natural Gradient Really Outperform the Euclidean Gradient?

tba
12:00 - 13:00 Lunch
13:00 - 14:00 Poster Session
14:00 - 14:25 Semih Cayci (RWTH Aachen University, Germany)

Convergence of Gauss-Newton in the Lazy Training Regime: A Riemannian Optimization Perspective

tba
14:25 - 14:50 Johannes Müller (TU Berlin, Germany)

Title: tba

tba
14:50 - 15:15 Alexander Friedrich (Umeå University, Sweden)

A First Construction of Neural ODEs on M-Polyfolds

tba
15:15 - 15:40 Thomas Martinetz (University of Lübeck, Germany)

Good by Default? Generalization in Highly Overparameterized Networks

tba
15:40 - 16:10 Coffee Break
16:10 - 17:00 Francis Bach (INRIA Paris Centre, France)

Denoising diffusion models without diffusions

Denoising diffusion models have enabled remarkable advances in generative modeling across various domains. These methods rely on a two-step process: first, sampling a noisy version of the data—an easier computational task—and then denoising it, either in a single step or through a sequential procedure. Both stages hinge on the same key component: the score function, which is closely tied to the optimal denoiser mapping noisy inputs back to clean data. In this talk, I will introduce an alternative perspective on denoising-based sampling that bypasses the need for continuous-time diffusion processes. This framework not only offers a fresh conceptual angle but also naturally extends to discrete settings, such as binary data. Joint work with Saeed Saremi and Ji-Won Park (https://arxiv.org/abs/2305.19473, https://arxiv.org/abs/2502.00557).
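As a toy illustration of the two-step procedure described above (a sketch under simplifying assumptions, not the construction of the cited papers): first draw samples from the Gaussian-smoothed density p_sigma = p * N(0, sigma^2 I), the computationally easier task, then denoise in a single step via Tweedie's formula E[x | y] = y + sigma^2 grad log p_sigma(y). Here the score of the smoothed density is available in closed form because the toy target is Gaussian; in practice it would be a trained network.

    # Toy "sample noisy, then denoise" sampler (illustrative only).
    import torch

    sigma = 1.0

    def smoothed_score(y):
        # Closed-form score of p_sigma for the toy target p = N(2, 0.25 I),
        # for which p_sigma = N(2, (0.25 + sigma^2) I). A real model would
        # use a trained score/denoiser network here.
        return (2.0 - y) / (0.25 + sigma**2)

    # Step 1: sample the smoothed density with unadjusted Langevin dynamics.
    y, step = torch.randn(512, 2), 0.05
    for _ in range(500):
        y = y + step * smoothed_score(y) \
              + (2 * step) ** 0.5 * torch.randn_like(y)

    # Step 2: single-step denoising via Tweedie's formula E[x|y].
    x = y + sigma**2 * smoothed_score(y)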
19:00 - 22:00 Dinner
Wednesday, Sept 24, 2025
09:00 - 09:50 Gitta Kutyniok (LMU Munich, Germany)

Reliable and Sustainable AI: From Mathematical Foundations to Next Generation AI Computing

The current wave of artificial intelligence is transforming industry, society, and the sciences at an unprecedented pace. Yet, despite its remarkable progress, today’s AI still suffers from two major limitations: a lack of reliability and excessive energy consumption. This lecture will begin with an overview of this dynamic field, focusing first on reliability. We will present recent theoretical advances in the areas of generalization and explainability -- core aspects of trustworthy AI that also intersect with regulatory frameworks such as the EU AI Act. From there, we will explore fundamental limitations of existing AI systems, including challenges related to computability and the energy inefficiency of current digital hardware. These challenges highlight the pressing need to rethink the foundations of AI computing. In the second part of the talk, we will turn to neuromorphic computing -- a promising and rapidly evolving paradigm that emulates biological neural systems using analog hardware. We will introduce spiking neural networks, a key model in this area, and share some of our recent mathematical findings. These results point toward a new generation of AI systems that are not only provably reliable but also sustainable.
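As background for the spiking neural networks mentioned in the abstract, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron, a standard building block of such models: the membrane potential leakily integrates its input and emits a discrete spike whenever it crosses a threshold, after which it resets. All parameter values are illustrative.

    # Minimal leaky integrate-and-fire (LIF) neuron (illustrative sketch).
    import numpy as np

    def lif_neuron(input_current, dt=1e-3, tau=2e-2, v_th=1.0, v_reset=0.0):
        """Return the membrane-potential trace and spike times."""
        v, trace, spikes = v_reset, [], []
        for k, i_k in enumerate(input_current):
            v += dt / tau * (-v + i_k)  # leaky integration: tau dv/dt = -v + I
            if v >= v_th:               # threshold crossing emits a spike
                spikes.append(k * dt)
                v = v_reset             # membrane potential resets
            trace.append(v)
        return np.array(trace), spikes

    # A constant drive above threshold yields a regular spike train.
    trace, spike_times = lif_neuron(np.full(1000, 1.5))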
09:50 - 10:15 Marco Mondelli (Institute of Science and Technology, Austria)

Title: tba

tba
10:15 - 10:40 Yury Korolev (University of Bath, United Kingdom)

Large-time dynamics in transformer architectures with layer normalisation

tba
10:40 - 11:10 Coffee Break
11:10 - 11:35 Leon Bungert (University of Würzburg, Germany)

Robustness on the interface of geometry and probability

tba
11:35 - 12:00 Martin Lazar (University of Dubrovnik, Croatia)

Be greedy and learn: efficient and certified algorithms for parametrized optimal control problems

tba
12:00 - 13:00 Lunch
13:00 - 14:00 Poster Session
14:00 - 14:50 Frank Nielsen (Sony Computer Science Laboratories Inc., Japan)

Title: tba

tba
14:50 - 15:15 Vahid Shahverdi (KTH, Sweden)

Title: tba

tba
15:15 - 15:40 Jesse van Oostrum (Hamburg University of Technology, Germany)

On the Natural Gradient of the Evidence Lower Bound

tba
15:40 - 16:10 Coffee Break
16:10 - 17:00 Jürgen Jost (MPI for Mathematics in the Sciences, Germany)

Geometric and statistical methods of data analysis. In memoriam Sayan Mukherjee

tba
17:00 - 17:25 Michael Murray (University of Bath, United Kingdom)

Title: tba

tba
17:25 - 17:50 Sebastian Kassing (TU Berlin, Germany)

On the effect of acceleration and regularization in machine learning

tba
Thursday, Sept 25, 2025
09:00 - 09:50 Markos Katsoulakis (University of Massachusetts Amherst, USA)

Title: tba

tba
09:50 - 10:15 Pavel Gurikov (Hamburg University of Technology, Germany)

Physics-Informed Machine Learning for Sustainable Process Design: Predicting Solubility in Green Solvents

tba
10:15 - 10:40 Sebastian Götschel (Hamburg University of Technology, Germany)

Hard-constraining Boundary Conditions for Physics-Informed Neural Operators

tba
10:40 - 11:10 Coffee Break
11:10 - 11:35 Jan Gerken (Chalmers University of Technology, Sweden)

Emergent Equivariance in Deep Ensembles

tba
11:35 - 12:00 Timm Faulwasser (Hamburg University of Technology, Germany)

The Optimal Control Perspective on Deep Neural Networks – Early Exits, Insights, and Open Problems

tba
12:00 - 13:00 Lunch
13:00 - 13:50 Matus Telgarsky (New York University, USA)

Title: tba

tba
13:50 - 14:15 Ahmed Abdeljawad (Radon Institute for Computational and Applied Mathematics, Austria)

Approximation Theory of Shallow Neural Networks

tba
14:15 - 14:40 Jethro Warnett (University of Oxford, United Kingdom)

Stein Variational Gradient Descent

tba
14:40 - 15:10 Coffee Break
15:10 - 16:00 Kenji Fukumizu (Institute of Statistical Mathematics, Japan)

Title: tba

tba
16:00 - 16:25 Vitalii Konarovskyi (University of Hamburg, Germany)

Stochastic Modified Flows, Mean-Field Limits and Dynamics of Stochastic Gradient Descent

tba
16:25 - 16:50 Tim Jahn (TU Berlin, Germany)

Learning Jump–Diffusion Dynamics from Irregularly-Sampled Data via Trajectory Generator Matching

tba
16:50 - 17:20 Coffee Break
17:20 - 17:45 Gianluca Finocchio (University of Vienna, Austria)

Model-Free Identification in Ill-Posed Regression

tba
17:45 - 18:10 Marzieh Eidi (MPI for Mathematics in the Sciences/ScaDS AI Institute, Germany)

Geometric learning in complex networks

tba