Channel – Talks (conferences, seminars, keynotes)

Channel – Simulation results

Channel – Tutorials


A cross-cut through a wind turbine rotor realised as cylindrical shapes. The code relies on ExaHyPE and uses an immersed boundary method by Dumbser et al. to map the shape onto the Cartesian mesh. Video created by Maurizio Tavelli, resulting from joint work with Michael Dumbser (Trento) and Leonhard Rannabauer (TUM).
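To make the mapping idea concrete, here is a minimal Python sketch (not the ExaHyPE code; all names are made up) that classifies the cells of a Cartesian grid on the unit square against a circle, i.e. the 2D footprint of such a cylindrical shape. Diffuse-interface immersed boundary schemes in the spirit of Dumbser et al. would additionally store a smooth volume fraction per cell; the sketch shows only the underlying binary inside/outside classification at the cell centres.

```python
# Hypothetical sketch: voxelise a circle (2D cylinder footprint) onto a
# regular Cartesian grid over the unit square by testing cell centres.

def flag_cells(cells, cx, cy, radius):
    """Return a cells x cells boolean grid: True where the cell centre
    lies inside the circle with centre (cx, cy) and the given radius."""
    h = 1.0 / cells                      # mesh width on the unit square
    grid = []
    for j in range(cells):
        row = []
        for i in range(cells):
            x, y = (i + 0.5) * h, (j + 0.5) * h   # cell centre
            row.append((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2)
        grid.append(row)
    return grid

if __name__ == "__main__":
    # visualise the voxelised shape as ASCII art
    grid = flag_cells(8, 0.5, 0.5, 0.25)
    for row in reversed(grid):
        print("".join("#" if inside else "." for inside in row))
```

The real code evaluates such a geometry query in every cell centre of the dynamically adaptive mesh rather than on a fixed regular grid.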
An illustration of our particle-in-cell algorithm for dynamically adaptive grids. The actual implementation follows Weinzierl, Verley, Henri and Roose: “Two particle-in-grid realisations on spacetrees”; an arXiv preprint is available. Our code has some unique selling points compared to other approaches: we support fully dynamic AMR without having to know the load decomposition or what the global AMR grid looks like; we support suprathermal particles, i.e. particles that suddenly accelerate and thus jump over multiple cells in one time step (tunnelling); and we realise everything single-touch, i.e. each particle is loaded into the processor and its cache only once per time step.
A toy problem that now ships with ExaHyPE. It solves the Euler equations on the unit square with an adaptive mesh, where the Durham Coat of Arms acts as initial condition and the ExaHyPE logo is injected as an additional (time-dependent/rupture-type) stimulus.
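For reference, the compressible Euler equations behind this demo read, in the conservative form used by hyperbolic solvers such as ExaHyPE (with the adiabatic index $\gamma$):

```latex
\partial_t
\begin{pmatrix} \rho \\ \rho \mathbf{v} \\ E \end{pmatrix}
+ \nabla \cdot
\begin{pmatrix} \rho \mathbf{v} \\ \rho \mathbf{v} \otimes \mathbf{v} + p\,I \\ (E + p)\,\mathbf{v} \end{pmatrix}
= 0,
\qquad
p = (\gamma - 1)\left( E - \tfrac{1}{2}\,\rho\,|\mathbf{v}|^2 \right)
```

Here $\rho$ is the density, $\mathbf{v}$ the velocity, $E$ the total energy density and $p$ the pressure; the logos prescribe the initial distribution of these quantities.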
The adaptive Cartesian grid we use to simulate molecule break-ups. Strictly speaking, we do not solve on these grids directly: we rewrite the Schrödinger equation into cascades of Helmholtz equations and then apply a basis transformation. The approach is sketched in Reps and Weinzierl: “Complex additive geometric multilevel solvers for Helmholtz equations on spacetrees”, but our major contribution is the construction of a stable, scaling Helmholtz preconditioner which solves the arising high-dimensional problems efficiently on manycore systems. The paper appeared in TOMS; a preprint is available.
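A back-of-the-envelope sketch of why Helmholtz-type problems arise here (for a single stationary state; the actual construction in the paper is more involved): inserting the separable ansatz $\psi(\mathbf{x}, t) = \phi(\mathbf{x})\, e^{-iEt/\hbar}$ into the Schrödinger equation

```latex
i\hbar\, \partial_t \psi
= -\frac{\hbar^2}{2m}\, \Delta \psi + V \psi
\quad \Longrightarrow \quad
\Delta \phi + k^2 \phi = \frac{2m}{\hbar^2}\, V \phi,
\qquad
k^2 = \frac{2mE}{\hbar^2}
```

turns the time-dependent problem into an equation of Helmholtz type, which is exactly the class of (complex-shifted) problems the preconditioner is built for.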
A very simple Sedov shock SPH simulation. The particle management relies on my particle-in-tree algorithms, while the actual physics kernels are taken from the open-source software Swift.
An ExaHyPE simulation grid spanned by Peano. We embed the rotor and prescribe its motion. The code studies the cell centres and flags them as inside (not shown), or otherwise computes the distance to the rotor if that distance is smaller than a given threshold. This is a non-trivial setup, as the embedded rotor mesh is very fine and changes every time step. We thus use the techniques discussed in Krestenitis and Weinzierl: “A multi-core ready discrete element method with triangles using dynamically adaptive multiscale grids”, where we realise the distance finding as a combination of a bounding-box approach, a functional minimisation, and a classic geometric collision test. The resulting code scales on AVX/SSE units.
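A hypothetical 2D sketch of that cascade (segments instead of triangles, so the functional minimisation reduces to a point-to-segment projection; all names are made up): a cheap bounding-box pre-check filters out most surface elements, and only the survivors pay for the exact distance computation.

```python
import math

# Hypothetical 2D sketch of the distance-finding cascade: bounding-box
# pre-check first, exact point-to-segment distance (the functional
# minimisation step) only for surface elements that pass it.

def point_segment_distance(p, a, b):
    """Exact distance from point p to segment ab: minimise |p - (a + t(b-a))|
    over t in [0,1] by projecting and clamping the minimiser."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                 # clamp onto the segment
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def flag_cell_centre(p, segments, threshold):
    """Return the distance from cell centre p to the closest surface segment
    if it is below the threshold, else None (cell is treated as 'far away')."""
    best = None
    for a, b in segments:
        # bounding-box pre-check, enlarged by the threshold
        if not (min(a[0], b[0]) - threshold <= p[0] <= max(a[0], b[0]) + threshold
                and min(a[1], b[1]) - threshold <= p[1] <= max(a[1], b[1]) + threshold):
            continue
        d = point_segment_distance(p, a, b)
        if d <= threshold and (best is None or d < best):
            best = d
    return best
```

The production code runs the analogous point-to-triangle tests over very fine, moving rotor meshes, which is why the vectorisation over AVX/SSE units matters.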
A video illustrating some of the ideas behind the paper “Block Fusion on Dynamically Adaptive Spacetree Grids for Shallow Water Waves”, which we published in Parallel Processing Letters. The video uses the shallow water equations run on the Durham Coat of Arms rather than a real tsunami setup.