Dates: Wednesdays, October 15, 2025 - February 4, 2026
Organizers: Barbara Verfürth, Herbert Koch and Illia Karabash
Venue: Lipschitzsaal, Mathezentrum, Endenicher Allee 60, 53115 Bonn
Date | Hausdorff Tea | Hausdorff Colloquium | Graduate Colloquium
04.02.2026 | 15:00 | 15:15 | Speaker TBA, Title TBA
James Wright (University of Edinburgh, Scotland): "Recent progress in pointwise ergodic theory"
We survey recent results establishing pointwise almost everywhere convergence of ergodic averages. We will make connections between these developments and advances in quantitative bounds for polynomial progressions in dense sets of integers.
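For orientation (this illustration is not part of the abstract), the ergodic averages in question are, in a standard formulation, the Birkhoff averages of a measure-preserving transformation $T$ on a probability space $(X,\mu)$,
\[
  A_N f(x) = \frac{1}{N} \sum_{n=1}^{N} f(T^n x),
\]
and pointwise ergodic theorems assert that $A_N f(x)$ converges for $\mu$-almost every $x$. The link to polynomial progressions concerns variants where $T^n$ is replaced by $T^{p(n)}$ for a polynomial $p$, as in Bourgain's pointwise theorems; the talk's precise setting may differ.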
Michael Alexis (MI): "Calculus teachers hate him because of this one weird trick, find out why!!"
I’ll present the concept of dyadic decomposition, a simple but stupidly effective technique for quickly estimating various sums and integrals, all without computing a single anti-derivative. We’ll discuss topics ranging from the p-test in Calculus to the Calderón-Zygmund decomposition for estimating averaging operators and the Hardy-Littlewood maximal function.
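As a concrete illustration of the trick (added here, not part of the abstract), the p-series can be estimated by grouping its terms into dyadic blocks:
\[
  \sum_{n=1}^{\infty} \frac{1}{n^{p}}
  = \sum_{k=0}^{\infty} \; \sum_{2^{k} \le n < 2^{k+1}} \frac{1}{n^{p}}
  \le \sum_{k=0}^{\infty} 2^{k} \cdot 2^{-kp}
  = \sum_{k=0}^{\infty} 2^{(1-p)k},
\]
a geometric series that converges precisely when $p > 1$. The matching lower bound of $2^{k} \cdot 2^{-(k+1)p}$ per block gives divergence for $0 < p \le 1$, all without computing an antiderivative.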
Mario Ohlberger (University of Münster, Germany): "Reduced Order Surrogate Models for PDE-Constrained Optimization and Inverse Problems"
Classically, model order reduction for parameterized systems is based on a so-called offline phase, where reduced approximation spaces are constructed and the reduced parameterized system is built, followed by an online phase, where the reduced system can be cheaply evaluated in a multi-query context. In this contribution, instead, we follow an active learning or enrichment approach where a multi-fidelity hierarchy of reduced order models is constructed on the fly while exploring a parameterized system. To this end, we focus on learning-based reduction methods in the context of PDE-constrained optimization and inverse problems and evaluate their overall efficiency. We discuss learning strategies, such as adaptive enrichment within a trust-region optimization framework, as well as a combination of reduced order models with machine learning approaches. Concepts of rigorous certification and convergence will be presented, as well as numerical experiments that demonstrate the efficiency of the proposed approaches.
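To make the offline/online split concrete, here is a minimal POD-Galerkin sketch in Python. The parameterized system, its affine form A(mu) = A0 + mu*A1, and all dimensions are hypothetical choices for illustration; the adaptive enrichment, trust-region certification, and machine learning components discussed in the talk are not reproduced here.

import numpy as np

# Hypothetical full-order model: A(mu) u = f, with affine form A(mu) = A0 + mu * A1.
n = 200
rng = np.random.default_rng(0)
A0 = np.diag(2.0 + rng.random(n))
A1 = np.diag(rng.random(n))
f = np.ones(n)

def solve_full(mu):
    # Expensive full-order solve (n x n).
    return np.linalg.solve(A0 + mu * A1, f)

# Offline phase: collect snapshots over a training set and build a reduced
# basis V via POD (truncated SVD of the snapshot matrix).
training_mus = np.linspace(0.1, 1.0, 10)
snapshots = np.column_stack([solve_full(mu) for mu in training_mus])
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5
V = U[:, :r]

# Precompute projected affine components so the online phase never touches the full system.
A0_r, A1_r, f_r = V.T @ A0 @ V, V.T @ A1 @ V, V.T @ f

def solve_reduced(mu):
    # Cheap online solve (r x r), then lift back to the full space.
    return V @ np.linalg.solve(A0_r + mu * A1_r, f_r)

# Reduction error at an unseen parameter.
mu_test = 0.37
print(np.linalg.norm(solve_full(mu_test) - solve_reduced(mu_test)))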