Extracting maximal information from upcoming cosmological surveys is a pressing task on the journey to understanding phenomena such as neutrino mass, dark energy, and inflation. This can be achieved both by advancing methodology and by carefully combining multiple cosmological datasets.
In the first part of my talk I will discuss methodological advancements for obtaining optimal constraints, focusing on field-level inference with differentiable forward modeling. I will first motivate this approach as a way both to reconstruct the initial conditions of the Universe and to obtain cosmological constraints. I will then tackle one of its bottlenecks -- sampling a high-dimensional parameter space -- by presenting a novel method, Microcanonical Langevin Monte Carlo. This method is orders of magnitude more efficient than traditional Hamiltonian Monte Carlo and will enable scaling field-level inference to the regime of upcoming surveys.
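To make the "differentiable forward modeling" ingredient concrete, here is a minimal toy sketch (hypothetical, not the speaker's pipeline): a linear Gaussian forward model with a hand-derived posterior gradient, verified against finite differences, driving a few unadjusted Langevin steps. In a real field-level analysis the gradient would come from automatic differentiation through a nonlinear structure-formation model, and the sampler would be HMC or Microcanonical Langevin Monte Carlo rather than this simple scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16                                       # "pixels" of a toy initial-condition field
A = rng.normal(size=(n, n)) / np.sqrt(n)     # toy linear forward model (stand-in for gravity)
s_true = rng.normal(size=n)                  # true initial conditions
sigma = 0.1                                  # noise level
d = A @ s_true + sigma * rng.normal(size=n)  # mock data

def log_post(s):
    """Unnormalized log posterior: N(0, I) prior on s plus Gaussian likelihood."""
    r = d - A @ s
    return -0.5 * s @ s - 0.5 * (r @ r) / sigma**2

def grad_log_post(s):
    """Analytic gradient; autodiff would supply this for a nonlinear forward model."""
    r = d - A @ s
    return -s + A.T @ r / sigma**2

# Sanity-check the gradient against central finite differences
# (exact up to roundoff here, since log_post is quadratic).
s0 = rng.normal(size=n)
eps = 1e-5
fd = np.array([(log_post(s0 + eps * e) - log_post(s0 - eps * e)) / (2 * eps)
               for e in np.eye(n)])
assert np.allclose(fd, grad_log_post(s0), atol=1e-4)

# A few unadjusted Langevin steps: the gradient steers the chain toward
# high-posterior initial conditions, which is what makes sampling
# high-dimensional fields tractable at all.
s, step = np.zeros(n), 1e-3
for _ in range(2000):
    s = s + step * grad_log_post(s) + np.sqrt(2 * step) * rng.normal(size=n)
```

The design point is that one gradient evaluation informs all `n` field degrees of freedom at once, which is why gradient-based samplers scale to the millions of voxels in a survey-scale initial-condition reconstruction while gradient-free methods do not.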
I will then discuss combining multiple cosmological datasets to break parameter degeneracies and calibrate systematics. In particular, I will present the HalfDome cosmological simulations, a set of large-volume simulations designed specifically to model the Universe from the CMB to large-scale structure for the joint analysis of Stage IV surveys. I will show how these simulations are being used to mitigate systematics, to obtain tighter constraints on cosmological parameters, and as a playground for machine learning applications.