I’m an Associate Professor in HPC/Scientific Computing at the Department of Computer Science (formerly the School of Engineering and Computing Sciences) at Durham University. Before joining Durham, I obtained a PhD from Technische Universität München (TUM) in Germany and served there as postdoctoral scientific project manager for international research consortia, in collaboration with KAUST and the Munich Centre of Advanced Computing. During that time, I also obtained a habilitation from TUM.
My objective in research is to find novel algorithms and clever implementations for applications from scientific computing that are too hard to solve today because we lack the right software. Often, existing solvers lack the required computational efficiency, hardware awareness, exploitation of data structure and distribution, ability to handle the required data cardinality, or software maturity. I want to change this: my work should enable others to simulate physical phenomena or study engineering challenges with unprecedented speed, accuracy and detail.
Within the multifaceted world of computer-enabled sciences, I focus on challenges around data flow and movement (minimisation), data structure (organisation) and programming paradigms. I also try to prove the correctness and efficiency of the implementations I propose. In recent years, data-driven approaches have gained importance; examples include online (auto-)tuning and regularisation through real data. All in all, I work primarily on the methodological and theoretical aspects of the computer science side of scientific computing, yet I anticipate ideas from hardware-aware performance engineering, pick up the latest trends in mathematics and take application knowledge into account, too. I am particularly interested in dynamically adaptive multiscale methods based upon spacetrees that interact with multigrid solvers for partial differential equations, host particle systems with particles of varying cut-off radii or sizes, or carry Finite Volume-like discretisations. Besides the core algorithm and data challenges, I am fascinated by on-the-fly visualisation and in-situ postprocessing as well as dynamic load balancing.
Whenever possible, I turn my research into open source software. My two major research codes are the Peano PDE solver framework and the C++ precompiler DaStGen, an example of a (very simple) HPC-specific language extension. Both are used, for instance, in the ExaHyPE project. Another code we currently work on is the Δ code (pronounced Delta), a triangle-based contact detection toolkit.
Four credos shape my work:
- It is important to use state-of-the-art mathematics when we study algorithms and their implementation. Multiscale and multigrid methods with adaptive discretisations are my particular favourites. It does not make sense to invest effort into suboptimal mathematics.
- It is important to use real-world data; it does not make sense to study idealised data. On the one hand, this means working with large data. On the other hand, it means working with data of the right character. I am personally not a big fan of plain pattern matching, but it is often quite intriguing, for example, to couple first-principles physics or models with patterns and characteristics that we observe on-the-fly.
- It is important to find an efficient implementation. The best algorithm and the best-quality data are of limited value if they cannot be implemented and processed efficiently.
- It is important to deliver plain, verifiably correct implementations that are made freely available. Otherwise, others cannot adopt the ideas, and results are difficult to reproduce and validate. Wherever possible, we have to provide performance, robustness and data access models for our codes.