This requires a wide variety of approaches: rapidly processing enormous amounts of data to turn raw measurements from the telescope into data ready for science; exploring data sets to find previously unknown phenomena; and carefully comparing observations to theoretical and numerical models, for example of stars and galaxies. With the advent of modern instrumentation, the data sets we explore can be both very large and highly complex: we may have to analyse billions of stars at the same time, or carefully control for the minute ways in which the incoming light is distorted. At the same time, astrophysical simulations have also grown enormously in size and complexity, requiring new approaches to make them computationally feasible and to sift through the large amounts of information they produce. Combining these complex simulations with modern, heterogeneous data sets is one of the major current challenges in astronomy.
To tackle these challenges, modern astronomy turns to rapid advances in statistics and computer science. At API, we use a broad range of state-of-the-art statistical approaches, including Bayesian (hierarchical) modelling, simulation-based inference, spectral timing, and probabilistic programming, to take full advantage of the varied data sets we have access to. Similarly, machine learning has begun to play a crucial role in astronomical data processing, discovery and analysis. We apply machine learning, for example, in our pipelines to find interesting transients in radio data; to emulate and speed up complex astrophysical models of stars, proto-planetary disks and black holes; and to mine large data sets for new astrophysical phenomena.
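To give a flavour of the simulation-based inference mentioned above, the sketch below shows one of its simplest forms, rejection-based approximate Bayesian computation, applied to a toy transient light curve. The simulator, parameter values, summary statistic and tolerance are purely illustrative assumptions, not part of any actual API pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "simulator": a transient with exponential decay time `tau` observed
# with Poisson photon-counting noise. Purely illustrative.
def simulate(tau, t):
    rate = 100.0 * np.exp(-t / tau)
    return rng.poisson(rate)

t = np.linspace(0.0, 10.0, 50)
tau_true = 3.0
observed = simulate(tau_true, t)

# Summary statistic used to compare simulated and observed data.
def summary(counts):
    return counts.mean()

# Rejection ABC: draw parameters from the prior, simulate, and keep the
# draws whose summary lies closest to that of the observed data.
n_draws = 20_000
tau_prior = rng.uniform(0.5, 10.0, n_draws)   # flat prior on the decay time
distances = np.array(
    [abs(summary(simulate(tau, t)) - summary(observed)) for tau in tau_prior]
)
tolerance = np.quantile(distances, 0.01)      # keep the closest 1% of draws
posterior_samples = tau_prior[distances <= tolerance]

print(f"posterior tau = {posterior_samples.mean():.2f} "
      f"+/- {posterior_samples.std():.2f} (true value {tau_true})")
```

In practice, such likelihood-free approaches are applied to far more complex simulators, with richer summary statistics or neural density estimators in place of the simple rejection step shown here.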
Astrostatistics, machine learning, artificial intelligence, time series analysis, spectral timing, Bayesian inference