Astrophysics deals with immense scales of time and space, requiring calculations that push the limits of traditional computing. At API, we employ advanced numerical simulations and leverage supercomputers capable of performing quadrillions of operations per second. This enables us to model complex systems such as turbulent interstellar gas dynamics, powerful cosmic explosions, plasma-physical processes around black holes, and sources of gravitational waves.

At API, numerical simulations are used in two ways: 

  • in synthetic data pipelines that predict the actual observational quantities (see the sketch after this list)

  • to understand the underlying physical interactions that govern complex systems such as multiple stellar systems and turbulent (magneto-)hydrodynamical flows
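
As a purely illustrative sketch of the first use, and not one of API’s production pipelines, the Python snippet below reduces a mock 3-D density cube (a stand-in for a simulation snapshot) to a synthetic column-density map, the kind of quantity an observation directly constrains. The grid size, units, and random field are assumptions chosen only for illustration.

```python
import numpy as np

# Stand-in for a snapshot from a (magneto-)hydrodynamical simulation:
# a 128^3 mock density cube with lognormal (turbulence-like) statistics.
rng = np.random.default_rng(seed=42)
n = 128
density = rng.lognormal(mean=0.0, sigma=1.0, size=(n, n, n))  # particles per cm^3 (assumed)

cell_size_cm = 3.086e18  # assume each grid cell is 1 parsec across

# "Observe" the cube: integrate the density along the line of sight
# (here the z-axis) to obtain a synthetic column-density map, which
# can be compared directly with what a telescope survey measures.
column_density = density.sum(axis=2) * cell_size_cm  # cm^-2

print(f"mean synthetic column density: {column_density.mean():.2e} cm^-2")
```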

To gather new knowledge, we must constantly extend our simulation tools so that they can handle larger scale separations or incorporate new physical effects. To this end, API researchers play a leading role in the development of several software packages used by the astrophysical community. Some of these codes have been run on the largest supercomputers in the world, leading to breakthrough insights into the astrophysical systems they model.

Current

API researchers are heavily involved in generating theoretical models for the Event Horizon Telescope Collaboration and contribute to two of the collaboration's simulation codes (BHAC and H-AMR). Research groups at API also develop and apply full dynamical-spacetime simulations to astrophysical multimessenger sources such as binary neutron star mergers and supernova explosions (using, e.g., GR-AMR-X within the EinsteinToolkit). Furthermore, state-of-the-art stellar dynamics and stellar evolution codes are actively developed, with an ongoing effort to integrate them into a uniform software framework (the AMUSE project).
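
As an illustration of the AMUSE approach, the sketch below couples a community N-body code through AMUSE's uniform Python interface, following the pattern in the project's public tutorials. The choice of solver (BHTree), the cluster model, and all numbers are illustrative assumptions, not a setup used in API's research.

```python
# Minimal AMUSE-style coupling sketch (after the public AMUSE tutorials).
# The solver choice (BHTree) and all parameters are illustrative only.
from amuse.units import units, nbody_system
from amuse.ic.plummer import new_plummer_model
from amuse.community.bhtree.interface import BHTree

# Unit converter: maps dimensionless N-body units onto physical units.
converter = nbody_system.nbody_to_si(1000.0 | units.MSun, 1.0 | units.parsec)

# Initial conditions: a Plummer sphere of 1000 equal-mass stars.
stars = new_plummer_model(1000, convert_nbody=converter)

# Launch the community gravity code as a worker and hand it the particles.
gravity = BHTree(converter)
gravity.particles.add_particles(stars)
channel = gravity.particles.new_channel_to(stars)

# Advance the model, then copy the updated state back to the local set.
gravity.evolve_model(10.0 | units.Myr)
channel.copy()
gravity.stop()

print(stars.center_of_mass())
```

The same pattern, with a different community code swapped in behind the same interface, is what allows AMUSE to combine gravity, stellar evolution, and hydrodynamics solvers within a single script.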

Other fields at API where simulations play a key role include the modeling of planetary atmospheres, astrochemistry, and planet formation.

API’s commitment to high-quality research software is also reflected in its two dedicated research software engineers (RSEs) on staff. Their expertise is available to all API staff via a regular call for projects.

Keywords

High-performance and high-throughput computing, computational fluid dynamics, N-body systems, open-source software

Facilities

Helios (API’s own 1000-core compute cluster), HIPSTER (the Faculty’s GPU cluster), SURF facilities (Snellius, SPIDER, LUMI), HPC Europe Tier-0 systems, and other international facilities used in collaboration (INCITE/DoE, Compute Canada)

Leading scientists

All based at the Faculty of Science, Anton Pannekoek Institute for Astronomy:

  • Dr. A. (Alessandra) Candian
  • Prof. dr. C. (Carsten) Dominik
  • Prof. dr. S.B. (Sera) Markoff
  • Dr. P. (Philipp) Moesta
  • Dr. A. (Antonija) Oklopčić
  • Dr. O.J.G. (Oliver) Porth
  • Dr. S.G.M. (Silvia) Toonen

Scientific software support

Also based at the Faculty of Science, Anton Pannekoek Institute for Astronomy:

  • Dr. E. (Evert) Rol
  • Dr. S. (Steven) Rieder