Optimizing the Extraction of Cosmological Information from the Latest Spectroscopic Redshift Surveys

Europe/Rome

Sexten Primary School
Via Panorama 6, Sexten (Italy)
Description

Observational cosmology is approaching a new era with a surge of spectroscopic galaxy redshift surveys all aimed at understanding the accelerated expansion of the Universe. Two notable examples are the Dark Energy Spectroscopic Instrument (DESI) and the ESA Euclid mission, which are obtaining their first scientific results. The primary probe of these surveys will be the baryon acoustic oscillations (BAO) imprinted in the large-scale structure (LSS) traced by galaxies. Galaxy clustering and the BAO as probes of the expansion history of the Universe were proven by the hugely successful surveys of the last decade (BOSS, eBOSS, WiggleZ). In combination with the Planck cosmic microwave background analyses, the advancements in theory and analysis methods of LSS with these surveys brought us to the current point of achieving percent-level constraints on the parameters that describe the flat ΛCDM cosmological model. However, the increasingly precise measurements have only deepened the enigma surrounding the Λ term and dark energy, which has been open for over 30 years. This meeting aims to discuss the latest advancements in the analysis of the LSS, exploiting both lower- and higher-order statistics from an observational and theoretical point of view, and also considering all the issues and mitigation strategies that need to be faced in real data analysis.

The meeting is divided into three slots, one related to "Observations", one related to "LSS analysis", and one on "Simulations and Machine learning for LSS". We plan to have invited and contributed presentations, as well as a panel discussion.

 

Important information

  • The participation is limited to a maximum of 50 participants due to venue capacity constraints
  • The conference fee is € 230,00 (€ 200,00 for students)

 

Important deadlines

  • Registration opening and call for abstract submission: January 22, 2025
  • Deadline for registration and abstract submission: March 23, 2025 (extended from March 9, 2025)
  • Selection of speakers and scientific programme: April 7, 2025

 

Invited Speakers:
  • Dida Markovic (JPL)
  • Will Percival (University of Waterloo)
  • Hector Gil Marin (Universitat de Barcelona)
  • Zachary Slepian (University of Florida)
  • Francisco Villaescusa-Navarro (Simons Foundation)
  • Carmelita Carbone (INAF)

 

    • I day: Observations
      • 1
        Introduction
        Speakers: Michele Ennio Maria Moresco (Istituto Nazionale di Astrofisica (INAF)), Michele Moresco (University of Bologna)
      • 10:30
        Coffee break
      • 2
        Roman
        Speaker: Markovic
      • 3
        PFS SSP Cosmology - Survey and Target Selection
        Speaker: Shi
      • 4
        MOONRISE: the main MOONS GTO extragalactic survey

        MOONS is the new Multi-Object Optical and Near-infrared Spectrograph for the Very Large Telescope (VLT) at ESO, which will see its first light this year. This remarkable instrument combines, for the first time, the collecting power of an 8-m telescope, 1000 fibres with individual robotic positioners, and both low- and high-resolution simultaneous spectral coverage across the 0.64-1.8 μm wavelength range. MOONS will therefore provide the astronomical community with a powerful, world-leading instrument able to serve a wide range of Galactic, extragalactic and cosmological studies. In this talk I will concentrate on the expected outcome of MOONRISE, the main GTO extragalactic survey, specifically aimed at obtaining key spectroscopic information for about half a million galaxies at 0.9 < z < 2.6, as well as for a few thousand galaxies around the epoch of reionisation (z ~ 6–8), and what this implies for large-scale structure as well as environmental studies.

        Speaker: Manuela Magliocchetti (Istituto Nazionale di Astrofisica (INAF))
      • 5
        Finding the brightest QSOs with QUBRICS

        The QUBRICS (QUasars as BRIght beacons for Cosmology in the Southern hemisphere) survey is designed to produce a sample of the brightest quasars with z ≳ 2.5, observable with facilities in the Southern Hemisphere, taking advantage of a score of recent and forthcoming databases: Gaia, Skymapper, PAN-STARRS, WISE, Euclid, Rubin etc.
        Identifying high-redshift QSOs involves two challenges: first, distinguishing QSOs from stars, galaxies, and other celestial objects, and second, accurately estimating their redshifts to exclude low-redshift candidates.
        The QUBRICS project has devised a strategy that includes the creation of an efficient database to train both classification and regression models. The survey employs Machine Learning techniques specifically tailored for extracting the rare QSOs from the vast datasets of optical and infrared wide-field surveys, optimizing various approaches to maximize either completeness or success rates based on specific scientific goals.
        This talk outlines the lessons learned in the field of machine learning, which have potential applications across diverse fields beyond astrophysics. Furthermore, it presents new results about the evolution of the QSO/AGN luminosity function (in conjunction with recent JWST observations), the AGN contribution to the cosmic UV background and HI and HeII reionization, a pilot program on the Sandage test of the cosmic redshift drift using the two brightest QSOs in the southern hemisphere and the Espresso QUasar Absorption Line Survey.

        Speaker: Prof. Stefano Cristiani (Istituto Nazionale di Astrofisica (INAF))
      • 13:00
        Lunch
      • 6
        Cosmological results from DESI
        Speaker: Percival
      • 7
        Evaluating Systematic Effects Using Simulations and Real Data in Euclid

        Galaxy clustering analysis relies on precise knowledge of the purity and completeness of cosmological data samples. These properties can be assessed with two approaches: simulations and deeper observations of specific fields.

        Simulations provide a controlled environment where the true input is known. However, they come with two major issues: they must be computationally efficient, and they must account for all instrumental features. Moreover, certain detector non-idealities, such as persistence and snowballs, cannot be fully understood and modeled, making it impossible to achieve completely realistic simulations.

        On the other hand, deeper observations are limited to small regions of the sky. Their purity and completeness are difficult to evaluate, and in the case of Euclid, the full survey depth required for their assessment could only be reached at the end of the mission. Analysis can be performed using fields with extensive spectroscopic redshift measurements, such as COSMOS. However, these fields are small, and the selection function of spectroscopic catalogs is often complex and not fully representative of the larger data set.

        Thus, we present an alternative approach to characterise the cosmological sample by injecting simulated spectra into Euclid spectroscopic images. These images are then processed through the full pipeline up to redshift determination. Instead of explicitly simulating systematic effects, this method naturally inherits them from real observations. This approach is computationally demanding but enables the evaluation of purity and completeness over a larger sky area compared to external datasets and represents a complementary strategy to assess systematics.

        Speaker: Francesca Passalacqua (INFN & University of Padova)
      • 8
        Characterizing selection effects in Stage IV spectroscopic surveys

        To effectively extract cosmological information from spectroscopic surveys, it is crucial to accurately characterize their selection function. This is typically achieved through the use of a random catalog, which serves as a Monte Carlo realization of the selection function. In this presentation, I will outline the forward-modeling approach used in the creation of the random catalog for the Euclid spectroscopic survey, emphasizing the challenges inherent in early data and, in particular, the issues arising from the lack of a highly pure and complete calibrator at this preliminary stage.
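
The random-catalogue idea described in the abstract can be sketched as a toy rejection sampler. The completeness map below is entirely hypothetical and merely stands in for a real selection function:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical angular completeness map (NOT the Euclid one):
# completeness decreases towards the celestial poles.
def completeness(ra_deg, dec_deg):
    return 0.9 * np.cos(np.radians(dec_deg)) ** 0.5

# Draw points uniformly on the sphere: RA uniform, sin(Dec) uniform.
n_draw = 100_000
ra = rng.uniform(0.0, 360.0, n_draw)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n_draw)))

# Rejection sampling: the surviving points form a Monte Carlo
# realization of the selection function, i.e. the random catalogue.
keep = rng.uniform(0.0, 1.0, n_draw) < completeness(ra, dec)
randoms = np.column_stack([ra[keep], dec[keep]])
```

In a real pipeline the analytic `completeness` would be replaced by a forward model of the survey footprint and instrumental effects.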

        Speaker: Antonio Farina (Istituto Nazionale di Astrofisica (INAF))
      • 16:00
        Coffee break
      • 9
        TBD
        Speaker: Benjamin Rudolph Granett (Istituto Nazionale di Astrofisica (INAF))
      • 10
        Observational Systematics in Euclid
        Speaker: Ilaria Risso (Istituto Nazionale di Astrofisica (INAF))
      • 11
        Discussion (?)
    • II day: LSS analysis
      • 12
        The full-shape analysis of the Dark Energy Spectroscopic Instrument Data Release 1
        Speaker: Gil-Marin
      • 13
        Towards an optimal marked correlation function analysis for the detection of modified gravity

        The marked two-point correlation function, which is particularly sensitive to the surrounding environment, offers a promising approach to enhancing the discriminating power in clustering analyses and to potentially detecting modified gravity (MG) signals. In this talk I will present my work that investigates novel marks based on large-scale environment estimates, which also exploit the anti-correlation between objects in low- and high-density regions. This is the first time that the propagation of discreteness effects in marked correlation functions is investigated in depth. The density-dependent marked correlation function estimated from catalogues is affected by shot noise in a non-trivial way. We assess the performance of various marks to distinguish general relativity (GR) from MG. This is achieved through the use of the ELEPHANT suite of simulations, which comprise five realisations of GR and two different MG theories: $f(R)$ and nDGP. In addition, discreteness effects are thoroughly studied using the high-density Covmos catalogues. We have established a robust method to correct for shot-noise effects that can be used in practical analyses. This method allows the recovery of the true signal, with an accuracy below 5%, over scales from $5\,h^{-1}$ Mpc up to $150\,h^{-1}$ Mpc. Furthermore, we demonstrate that marks that anti-correlate objects in low- and high-density regions are among the most effective in distinguishing between MG and GR; they also uniquely provide visible deviations on large scales, up to about $80\,h^{-1}$ Mpc. We report differences in the marked correlation function between $f(R)$ with $|f_{R0}| = 10^{-6}$ and GR simulations of the order of $3–5\sigma$ in real space. The redshift-space monopole of the marked correlation function in this MG scenario exhibits similar features and performance as the real-space marked correlation function.
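
As a toy illustration of a density-dependent mark (my own construction, not the specific marks studied in the talk), one can weight pairs by a crude local-density estimate and compare the marked pair average with the unmarked one:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Toy periodic catalogue (Poisson, so the marked signal should be near 1).
L, n = 100.0, 2000
pos = rng.uniform(0.0, L, size=(n, 3))
tree = cKDTree(pos, boxsize=L)

# Density-dependent mark: inverse distance to the 5th neighbour, a crude
# local-density estimate (one simple choice, not the marks of the talk).
d5 = tree.query(pos, k=6)[0][:, -1]   # k=6: the first hit is the point itself
mark = 1.0 / d5
mark /= mark.mean()                   # normalise to <m> = 1

# Marked correlation M(r) = <m_i m_j>_pairs / <m>^2 in one radial bin.
r_lo, r_hi = 5.0, 10.0
pairs = tree.query_pairs(r_hi, output_type="ndarray")
sep = pos[pairs[:, 0]] - pos[pairs[:, 1]]
sep = (sep + L / 2.0) % L - L / 2.0   # periodic wrap
d = np.linalg.norm(sep, axis=1)
in_bin = (d >= r_lo) & (d < r_hi)
M = (mark[pairs[in_bin, 0]] * mark[pairs[in_bin, 1]]).mean()
```

On a Poisson field M stays close to unity; clustered catalogues and other mark choices would push it away from 1, which is the signal exploited in the talk.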

        Speaker: Martin Kärcher (University of Milan)
      • 14
        CNN-Enhanced Zel’dovich Reconstruction for BAO analysis in Large-Scale Structure Surveys

        The Baryon Acoustic Oscillation (BAO) scale in the two-point galaxy correlation function serves as a standard ruler to trace the expansion history of the Universe and constrain the properties of dark energy, as demonstrated by the recent results of the DESI survey. Precise measurements of the BAO scale rely on a nonlinear transformation of the data commonly known as “reconstruction”. The standard approach, based on the Zel’dovich approximation, can be improved in several ways which, however, are typically computationally demanding and, for this reason, difficult to apply to large datasets.
        I present a novel approach, dubbed Enhanced Zel’dovich Reconstruction (EZR), which leverages a neural network previously trained on a Zel’dovich-reconstructed density field to improve the quality of the reconstruction. The goal is to improve the matching between the model and the measured BAO peak and to increase the precision of the inferred cosmological parameters.
        I have compared the performance of EZR to that of the standard Zel’dovich reconstruction using a large suite of simulated data. My results indicate that the BAO peak reconstruction performed with EZR is indeed more precise and consequently it provides tighter constraints on the cosmological parameters that determine the cosmic expansion history of the Universe.
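
The standard Zel'dovich step that EZR starts from can be sketched with a minimal FFT-based displacement solver; the grid size and the 15 Mpc/h smoothing scale below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy over-density field on a periodic grid (a stand-in for a smoothed
# galaxy density field; grid and box sizes are arbitrary).
n, L = 64, 500.0                          # cells per side, box size [Mpc/h]
delta = rng.normal(0.0, 0.1, (n, n, n))
delta -= delta.mean()

# Zel'dovich displacement: div(psi) = -delta_smoothed, i.e. in Fourier
# space psi_k = i k delta_k / k^2 (times a Gaussian smoothing kernel).
k1d = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = 1.0                         # avoid 0/0; the DC mode is zero
smooth = np.exp(-0.5 * k2 * 15.0**2)      # 15 Mpc/h Gaussian smoothing
dk = np.fft.fftn(delta) * smooth
psi = [np.real(np.fft.ifftn(1j * ki * dk / k2)) for ki in (kx, ky, kz)]
# Galaxies would now be shifted by -psi (interpolated to their positions)
# to undo large-scale bulk flows before re-measuring the BAO peak.
```

EZR, as described in the abstract, then feeds the Zel'dovich-reconstructed field to a trained neural network rather than stopping at this linear step.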

        Speaker: Edoardo Maragliano (University of Genoa, INFN)
      • 10:30
        Coffee break
      • 15
        Modelling the distribution of galaxy multi-tracers through cosmic time

        The latest generation of spectroscopic surveys, such as DESI, Euclid or Subaru-PFS, aims to map the large-scale structure of the Universe with unprecedented accuracy by targeting diverse galaxy populations: luminous red galaxies (LRGs), emission-line galaxies (ELGs), and quasars (QSOs). These sources serve as biased multi-tracers of the underlying dark matter field, and their clustering properties provide crucial constraints on the expansion history and growth of cosmic structure.

        In this talk, I will present our latest results on modeling the DESI ELG and LRG clustering and its connection to dark matter halos using the AbacusSummit simulation, coupled with a novel halo occupation framework for galaxy multi-tracers. Our approach incorporates intra-halo dynamics and quenching to reproduce the anisotropic clustering down to 0.1 Mpc/h scales with unprecedented accuracy. By leveraging DESI data and reaching 20 times better resolution than current abundance matching studies, our model provides a robust benchmark for future cosmological analyses. These findings not only refine our understanding of galaxy bias and systematics but also offer new insights into the nature of dark matter and gravity on sub-megaparsec scales.

        Speaker: Ginevra Favole (Instituto de Astrofísica de Canarias)
      • 16
        Indicator Power Spectra

        Indicator functions identify regions of a given density, characterizing the density dependence of clustering. I show that indicator-function power spectra are biased versions of the linear spectrum on large scales. A first-principles calculation of this bias reproduces simulation results. I provide a simple functional form for the translinear portion of the indicator-function spectra. These spectra facilitate surgical excision of non-linearity and thus significantly increase the reach of linear theory, extracting information beyond the traditional linear power spectrum. Indicator spectra also enable the calculation of theoretical covariance matrices for counts-in-cells (CIC), facilitating parameter estimation with complementary CIC methods.
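
The indicator construction can be illustrated on a toy white-noise field. This shows only the mechanics of measuring an indicator-function spectrum, not the large-scale bias result of the talk:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 64
delta = rng.normal(0.0, 1.0, (n, n, n))   # stand-in density field

# Indicator function: 1 where the density falls in the chosen bin
# (here simply "above the median"), 0 elsewhere.
ind = (delta > np.median(delta)).astype(float)
ind_contrast = ind / ind.mean() - 1.0     # indicator over-density

# Spherically averaged indicator power spectrum on the grid (arbitrary units).
fk = np.fft.fftn(ind_contrast) / ind_contrast.size
pk3 = np.abs(fk) ** 2
kf = np.fft.fftfreq(n)
kx, ky, kz = np.meshgrid(kf, kf, kf, indexing="ij")
kmag = np.sqrt(kx**2 + ky**2 + kz**2)
bins = np.linspace(1e-3, 0.5, 21)
power, _ = np.histogram(kmag, bins=bins, weights=pk3)
counts, _ = np.histogram(kmag, bins=bins)
pk = power / np.maximum(counts, 1)
```

Repeating this for several density bins yields the family of indicator spectra whose large-scale amplitudes the talk relates to the linear spectrum.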

        Speaker: Istvan Szapudi (Institute for Astronomy, University of Hawaii)
      • 17
        The clustering of dark matter haloes in alternative cosmological models

        The clustering of large-scale structure has been recognised as a fundamental cosmological probe, which offers us the possibility to constrain fundamental parameters, such as the matter density content of the Universe. Currently, the most acknowledged cosmological scenario is the $\Lambda$CDM model, which assumes that dark matter particles exist in a 'cold' version, namely in the form of very massive candidates, e.g. WIMPs, or condensates of light axions. It is consistent with observations on scales ranging from the size of the cosmological horizon to the typical intergalactic distances. However, some possible tensions, related to the number of satellite galaxies and to the halo density profiles, have been suggested by observations at the typical galactic and sub-galactic scales, of the order of kpc. One possible solution is provided by complex baryonic feedback. Another possibility is to explore the dark sector, investigating the macroscopic consequences of alternative cosmological scenarios based on the existence of warm dark matter (WDM) or self-interacting dark matter (SIDM) particles. For this reason, we have used a set of dark matter-only simulations to investigate the clustering properties of haloes in both cold and different WDM and SIDM models. In particular, the small-scale clustering of dark matter haloes provides a valid way to discriminate between different cosmological scenarios, in preparation for a more detailed study that fully incorporates baryonic effects, and for comparison with observational data from galaxy clustering.

        Speaker: Massimiliano Romanello (Università di Bologna)
      • 18
        De-noising cosmological covariance matrices using Rotational Invariant estimators

        In this talk, I will explore the potential of de-noising large cosmological covariance matrices using analytical techniques from Random Matrix Theory, particularly the class of Rotational Invariant estimators. I will evaluate the performance of this approach using galaxy clustering statistics and I will compare them with non-linear shrinkage methods, highlighting their advantages and limitations.

        Speaker: Antonio Farina (Istituto Nazionale di Astrofisica (INAF))
      • 13:00
        Lunch
      • 19
        Measuring higher-order correlation functions in current and future galaxy surveys
        Speaker: Slepian
      • 20
        Cosmology from Large Scale full-shape analyses combining 2PCF and 3PCF

        In this talk, we present, for the first time, cosmological parameter constraints obtained by jointly analysing the two-point correlation function (2PCF) and the three-point correlation function (3PCF). This work marks the final step of a research programme which, over the past few years, has bridged the gap between configuration space and Fourier space, both in terms of observational measurements and theoretical modelling. We will demonstrate how incorporating the 3PCF alongside the 2PCF enhances the constraining power of full-shape analyses, reaching a level comparable to that of Fourier space studies — all while avoiding the complications associated with window functions.

        Speaker: Massimo Guidi (University of Bologna)
      • 21
        Consistent Clustering Analysis in Configuration and Fourier Space

        Understanding the large-scale distribution of galaxies often relies on two-point statistics—either the configuration-space two-point correlation function or its Fourier-space counterpart, the power spectrum. Although these measures are theoretically equivalent, practical estimates can lead to subtle differences that impact the cosmological interpretation of the results.

        In this talk, I will present a comparison of clustering statistics derived from both Fourier and configuration spaces within a full-shape analysis framework. I will explore their methodological similarities and differences and discuss the implications for cosmological modeling and parameter estimation.

        Furthermore, I will share results from a joint likelihood analysis that combines both approaches. This analysis leverages a theoretically motivated covariance model and advanced denoising techniques to enhance the robustness and accuracy of the inferred cosmological parameters.

        Speaker: Alfonso Veropalumbo (Istituto Nazionale di Astrofisica (INAF))
      • 16:00
        Coffee break
      • 22
        Toward a full-shape analysis of the galaxy anisotropic 3-point correlation function at the BAO scale

        With present and upcoming Stage IV spectroscopic surveys, we will soon have data for more than a billion galaxies. This enables us to move beyond traditional two-point statistics to explore higher-order correlations. In particular, the three-point correlation function (3PCF) offers many advantages – it can probe non-Gaussianity, break degeneracies between cosmological parameters, and improve their constraints. Furthermore, a new model of the anisotropic 3PCF enables the separation of the signal contributions from redshift-space distortions and the Alcock-Paczyński effect, which is crucial as all the surveys are in redshift space.

        The bottleneck of this promising 3PCF model is its computational cost – the matter 3PCF takes 24 CPU hours, and the galaxy 3PCF more than 100 CPU hours, for a single cosmological model. Consequently, it would take several years to constrain cosmological parameters with methods such as Markov chain Monte Carlo (MCMC). Therefore, to speed up the computation, we applied a neural network algorithm to create an emulator, which can compute one cosmological model in less than a second on a laptop.

        The emulator allows variation of the following parameters: matter density $\Omega_m$, scalar amplitude of primordial density fluctuations $A_s$ and dimensionless Hubble parameter $h$. It accelerates the computation of one model by more than 10 million times, while maintaining competitive sub-percent accuracy. It will enable the use of the galaxy anisotropic 3PCF on upcoming datasets from, for example, Euclid or DESI.

        Additionally, we found that including the anisotropic component in the cosmological parameter constraints yields significant improvements over the isotropic component. If the squeezed triangle configurations are included in the analysis ($r_{\rm min} = 20\,h^{-1}$ Mpc), we obtain an improvement of over 25%. If the squeezed triangles are excluded, the improvement is negligible.

        Speaker: Kristers Nagainis (University of Latvia Institute of Astronomy)
      • 23
        TBD
        Speakers: Michele Ennio Maria Moresco (Istituto Nazionale di Astrofisica (INAF)), Michele Moresco (University of Bologna)
      • 24
        The BAO linear point as cosmic ruler: tests and applications to the Euclid mission

        The large-scale distribution of galaxies contains information about the acoustic waves that propagated in the primordial baryon-photon plasma. These waves imprint a characteristic scale in the two-point correlation function of the galaxies. This scale, called the Linear Point, is defined as the mid-point between the maximum and the minimum of the correlation function at scales of about 150 Mpc.
        In my talk, I discuss why the Linear Point is a cosmic ruler that enables us to measure cosmological distances without the need to model the impact of non-linearities on the correlation function of galaxies.
        Finally, I focus on my current research in the context of the Euclid mission. We are studying the accuracy and the expected precision of the Linear Point measurements on mock catalogs (dark matter particles, halos and galaxies), based on the mission’s characteristics. This work is necessary for preparing to apply the Linear Point to Euclid data.
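
The linear-point definition can be illustrated on a toy correlation function; the analytic form and all numbers below are invented for illustration only:

```python
import numpy as np

# Toy correlation function: a smooth declining component plus a Gaussian
# BAO-like bump near 100 Mpc/h (purely illustrative, not a fitted model).
r = np.linspace(60.0, 140.0, 1601)
xi = (r / 60.0) ** -1.5 + 0.15 * np.exp(-0.5 * ((r - 100.0) / 8.0) ** 2)

# Locate the BAO dip and peak in broad search windows, then take the
# mid-point: that mid-point is the Linear Point.
dip_win = (r > 75.0) & (r < 92.0)
peak_win = (r > 92.0) & (r < 110.0)
r_dip = r[dip_win][np.argmin(xi[dip_win])]
r_peak = r[peak_win][np.argmax(xi[peak_win])]
linear_point = 0.5 * (r_dip + r_peak)
```

The appeal described in the abstract is that this mid-point is nearly insensitive to the non-linear broadening that shifts the peak and dip individually.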

        Speaker: Angelo Ferrari (INFN - Bologna)
    • 09:00
      Free day / collaboration work
    • III day: Alternative probes
      • 25
        Field Level Inference with Fully Differentiable Hydrodynamical Physics

        Ongoing and upcoming spectroscopic redshift surveys will reach unprecedented depths and number densities, providing a wealth of information about galaxy clustering. However, current perturbative approaches struggle to accurately model small-scale clustering due to uncertainties in galaxy formation and bias modeling. While various astrophysical probes can inform these models, no existing cosmological hydrodynamical simulation can self-consistently capture all relevant processes due to uncertainties in subgrid physics. I will present a new generation of hydrodynamical simulations which allow rapid gradient estimation through both the gas dynamics and subgrid models, facilitating efficient inference of underlying physical parameters and initial conditions. This framework opens new avenues for fully exploiting next-generation survey data at the field level.

        Speaker: Benjamin Horowitz (Kavli IPMU)
      • 26
        Field-level inference and the path towards percent-level constraint on growth of structure from spectroscopic surveys

        Field-level inference (FLI) is a framework to analyze galaxy clustering data from spectroscopic surveys directly at the level of the three-dimensional fields. This approach bypasses data compression and hence preserves the entire cosmological information encoded in the data. In this talk, I'll first present recent results from simulated data showcasing how FLI significantly improves the constraint on the growth of structure over the standard power spectrum plus bispectrum (P+B) statistics. I'll then provide further theoretical calculations and the resulting insights into the information content that can be accessed by FLI versus P+B. Finally, I'll provide updates on recent progress in connecting FLI to real observational data.

        Speaker: Minh Nguyen (Kavli IPMU)
      • 27
        Benchmarking field-level inference from galaxy surveys and its application to primordial non-Gaussianity analysis

        Field-level inference has emerged as a powerful framework to fully exploit cosmological information from next-generation galaxy surveys. It involves performing Bayesian inference to jointly estimate the cosmological parameters and initial conditions of the cosmic field directly from the observed galaxy density field. However, the computational feasibility of MCMC (Markov Chain Monte Carlo) sampling methods for large-scale field-level inference remains an open question. To address this, we introduce a standardized benchmark using a fast, differentiable galaxy density simulator based on JaxPM. We evaluate various sampling techniques, including Hamiltonian Monte Carlo (HMC), the No-U-Turn Sampler (NUTS), and Microcanonical Langevin Monte Carlo (MCLMC), comparing their efficiency in generating independent samples per model evaluation. Our results demonstrate that careful preconditioning of latent variables is crucial, and that MCLMC outperforms other methods by an order of magnitude in efficiency while maintaining minimal bias in the marginal posterior.
        These methodological advances pave the way for applying field-level cosmological inference to galaxy surveys. Specifically, we explore its application to the analysis of primordial non-Gaussianity (PNG) using DESI data, with a particular focus on a rigorous field-level treatment of imaging systematics.
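
The efficiency metric used in such benchmarks, independent samples per model evaluation, can be sketched on a toy autocorrelated chain. The AR(1) chain and the cost per step below are placeholders, not the benchmark's actual samplers:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy AR(1) chain standing in for MCMC samples of a single parameter
# (rho and the chain length are placeholders).
rho, n = 0.9, 10_000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = rho * x[t - 1] + np.sqrt(1.0 - rho**2) * rng.normal()

def ess(chain):
    """Effective sample size from the initial run of positive autocorrelations."""
    c = chain - chain.mean()
    acf = np.correlate(c, c, mode="full")[len(c) - 1:] / np.dot(c, c)
    tau = 1.0
    for r in acf[1:]:
        if r < 0.05:              # crude truncation of the noisy ACF tail
            break
        tau += 2.0 * r
    return len(chain) / tau

# The benchmark metric: independent samples per model (gradient) evaluation.
evals_per_step = 10               # hypothetical cost of one HMC-like step
efficiency = ess(x) / (n * evals_per_step)
```

Comparing this ratio across HMC, NUTS and MCLMC chains on the same posterior is what the benchmark in the abstract formalizes.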

        Speaker: Hugo Simon-Onfroy (CEA Paris-Saclay)
      • 10:30
        Coffee break
      • 28
        Going beyond the power spectrum: an analysis of BOSS & DESI galaxy clustering using the wavelet scattering transform

        Optimal extraction of the non-Gaussian information encoded in the Large-Scale Structure (LSS) of the universe lies at the forefront of modern precision cosmology. In this talk, I plan to discuss recent efforts to achieve this task using the Wavelet Scattering Transform (WST), which subjects an input field to a layer of non-linear transformations that are sensitive to non-Gaussianity through a generated set of WST coefficients. In order to assess its applicability in the context of LSS surveys, I will present recent progress towards the application of this technique to DESI galaxy clustering observations, moving beyond a past analysis of the precursory BOSS dataset. I will summarize the latest efforts to assess the robustness of this estimator through an emulator mock challenge within the DESI collaboration, before discussing a series of improvements for its application to the anisotropic redshift-space clustering traced by DESI Year 1/Year 3 spectroscopic observations. Finally, I will show first proof-of-concept results from the application of this novel technique to another key cosmological probe, the Lyman-alpha forest.

        Speaker: Georgios Valogiannis (University of Chicago)
      • 29
        Boosting HI-Galaxy Cross-Clustering Signal through Higher-Order Cross-Correlations

        After reionization, neutral hydrogen (HI) traces the large-scale structure (LSS) of the Universe, enabling HI intensity mapping (IM) to capture the LSS in 3D and constrain key cosmological parameters. We present a new framework utilizing higher-order cross-correlations to study HI clustering around galaxies, tested using real-space data from the IllustrisTNG300 simulation. This approach computes the joint distributions of $k$-nearest neighbor ($k$NN) optical galaxies and the HI brightness temperature field smoothed at relevant scales (the $k$NN-field framework), providing sensitivity to all higher-order cross-correlations, unlike two-point statistics. To simulate HI data from actual surveys, we add random thermal noise and apply a simple foreground cleaning model, filtering out Fourier modes of the brightness temperature field with $k_{\parallel} < k_{\rm min,\parallel}$. Under current levels of thermal noise and foreground cleaning, typical of a Canadian Hydrogen Intensity Mapping Experiment (CHIME)-like survey, the HI-galaxy cross-correlation signal in our simulations, using the $k$NN-field framework, is detectable at > 30$\sigma$ across $r = [3, 12] \,h^{-1}$ Mpc. In contrast, the detectability of the standard two-point correlation function (2PCF) over the same scales depends strongly on the foreground filter: a sharp $k$ filter can spuriously boost detection to 8$\sigma$ due to position-space ringing, whereas a less sharp filter yields no detection. Nonetheless, we conclude that $k$NN-field cross-correlations are robustly detectable across a broad range of foreground filtering and thermal noise conditions, suggesting their potential for enhanced constraining power over 2PCFs.
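
The galaxy side of the kNN-field framework reduces to the standard kNN-CDF, which can be sketched on a toy Poisson catalogue (the field cross term would additionally threshold a smoothed brightness-temperature field at the same query points):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)

# Toy tracer catalogue in a periodic box (stand-in for an optical sample).
L, n_gal = 100.0, 5000
gal = rng.uniform(0.0, L, size=(n_gal, 3))

# Distances from volume-filling query points to their 3rd-nearest galaxy.
query = rng.uniform(0.0, L, size=(20000, 3))
d3 = cKDTree(gal, boxsize=L).query(query, k=3)[0][:, -1]

# The kNN-CDF: fraction of query points whose 3rd neighbour lies within r.
r_grid = np.linspace(0.0, 10.0, 50)
knn_cdf = np.searchsorted(np.sort(d3), r_grid) / len(d3)
```

Because the CDF at a given radius depends on the full count distribution, it is sensitive to all orders of clustering, which is the property the abstract exploits.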

        Speaker: Eishica Chand (IISER PUNE)
      • 30
        Cosmological constraints with the Voronoi Volume Function

        The Voronoi Volume Function (VVF) is the distribution of cell volumes in the Voronoi tessellation of a given set of cosmological tracers. It encodes information about the full clustering hierarchy of the tracer population and serves as an excellent higher-order statistic for probing non-Gaussianity in the large-scale structure. Despite its sensitivity to cosmological parameters, redshift-space effects, and tracer properties, the VVF remains relatively unexplored. In this work, we use a suite of N-body simulations to perform a Fisher forecast using the VVF. Our results demonstrate that combining VVF information from multiple tracer populations can yield highly precise constraints on certain cosmological parameters. This work can be further extended by integrating the VVF with other statistics, such as the power spectrum, to enhance cosmological constraints. Additionally, the VVF presents exciting opportunities for studying galaxy evolution and the nature of dark matter. As observational surveys continue to improve in precision and coverage, incorporating the VVF into large-scale structure analyses could provide additional insights and complement traditional statistical methods.

        Speaker: Ms Saee Dhawalikar (Inter University Centre for Astronomy and Astrophysics (IUCAA), Maharashtra India)
      • 31
        Cosmic Voids: Unlocking Novel Statistics Completing Galaxy Clustering

        In recent years, cosmic voids have emerged as powerful tools in cosmology, offering unique insights that complement traditional two-point statistics of galaxy clustering. To fully exploit the potential of voids for the vast datasets from ongoing and upcoming galaxy surveys, robust statistical modeling and careful data reduction are crucial.

        This talk will delve into the application of the three primary void statistics in galaxy surveys: the void size function, the void-galaxy cross-correlation function, and the void auto-correlation function. I will highlight how theoretical modeling and optimized data processing can enhance their constraining power while avoiding biases. Particular focus will be placed on the role of voids within the Euclid and Roman survey frameworks, illustrating their complementarity to traditional analyses in probing dark energy and other key cosmological parameters. These results highlight the potential of combining cosmic voids with galaxy two-point statistics to enhance the constraining power of cosmological analyses in stage IV galaxy surveys.

        Speaker: Giovanni Verza
      • 13:00
        Lunch
      • 32
        Perspective on simulations for future clustering analysis

        In this talk, we explore the transformative role of cosmological N-body simulations in advancing our understanding of galaxy clustering. As galaxy surveys become increasingly detailed, the demand for accurate modelling intensifies. We delve into how deep-learning acceleration and super-resolution simulations offer unprecedented precision and minimize computational costs, paving the way for enhanced clustering analysis. These cutting-edge methodologies promise to revolutionize the modelling and covariance production of large-scale cosmic structures for ongoing and upcoming galaxy surveys.

        Speaker: Dr Carmelita Carbone (INAF IASF-MI)
      • 33
        Cosmology in the era of AI agents
        Speaker: Villaescusa
      • 34
        Efficient simulation of galaxy lightcones and associated radiation fields

        I will present a substantial upgrade to the public simulation code, 21cmFASTv4, allowing efficient and flexible forward models of galaxy and IGM observables. In an end-to-end approach, we sample cosmological parameters, creating a 3D realization of the initial conditions. Dark matter halos are identified in Lagrangian space and then moved, together with the matter field, to Eulerian space using 2LPT. Galaxies are assigned to DM halos by sampling parametric conditional probability densities motivated by well-established empirical relations such as the stellar-to-halo relation, the star-forming main sequence, and the fundamental metallicity relation. Cosmic radiation fields (Lyman-alpha, Lyman-continuum, and X-ray) sourced by these galaxies are calculated using approximate radiative transfer, and the IGM is evolved accordingly. Under fiducial settings, 21cmFASTv4 computes all of these steps in ∼2 core hours for a single realization, thus facilitating high-dimensional, multi-tracer, field-level Bayesian inference of cosmology and astrophysics.
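        As a toy illustration of the kind of parametric conditional sampling described above (the slope, normalisation, and scatter below are illustrative placeholders, not the calibrated 21cmFASTv4 relations), stellar masses can be drawn for haloes from a mean power-law stellar-to-halo relation with log-normal scatter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock halo masses in log10(Msun); values are illustrative only.
log_mh = rng.uniform(9.0, 13.0, size=10_000)

# Toy mean stellar-to-halo relation plus 0.2 dex log-normal scatter.
mean_log_mstar = 10.5 + 0.8 * (log_mh - 12.0)
log_mstar = rng.normal(mean_log_mstar, 0.2)

# Further galaxy properties (e.g. SFR from a star-forming main
# sequence) would be sampled the same way, conditioning on the
# stellar mass just drawn.
```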

        Speaker: Andrei Albert Mesinger (Istituto Nazionale di Astrofisica (INAF))
      • 16:00
        Coffee break
      • 35
        Cosmological probes with cosmic voids: advanced modelling and the latest results

        The integrated Sachs-Wolfe (ISW) and gravitational lensing signals of cosmic large-scale structures probe the growth of structure in the low-redshift cosmic web. In this talk, I will summarise recent results on detecting cosmic voids in the galaxy distribution, and their cross-correlations with CMB lensing and temperature maps, using surveys such as DES, DESI, Euclid, Pan-STARRS, and also Gaia quasars. I will also explain how we are using the Gower St simulations, with hundreds of different cosmologies, to build a new modelling framework for cosmological constraints using cosmic voids.

        Speaker: András Kovács (Konkoly Observatory)
      • 36
        Extracting cosmological information from the shape of cosmic voids

        The emerging field of cosmic void studies provides a powerful probe for testing cosmological models and the properties of large-scale structure. Among the key statistics in this context is the void shape, which can be characterized by the void-galaxy cross-correlation function. This statistic encapsulates valuable information about the underlying cosmological model and the dynamics of the cosmic web.

        In this talk, I will present the potential of void shape as a cosmological observable, highlighting its ability to constrain fundamental physics. Additionally, I will discuss key techniques to mitigate systematic effects that can affect its measurement, including the application of velocity reconstruction methods. These advancements pave the way for a more robust use of voids in upcoming spectroscopic surveys, offering complementary insights to traditional large-scale structure probes.

        Speaker: Giulia Degni (Università Roma Tre)
      • 37
        Unveiling Cosmic Voids Through Tracer Dynamics: A Novel Approach for Large-Scale Structure Analyses

        With the rise of wide-area galaxy surveys, cosmic voids have emerged as powerful probes of the large-scale structure of the Universe, both as stand-alone observables and as complementary to galaxy clustering statistics. However, cosmological analyses of cosmic voids are subject to various observational systematics, exacerbated by their characteristic size and underdense environment.

        To mitigate these systematics and enhance the cosmological constraining power of cosmic voids, I present a novel dynamical void-finding algorithm, the Back-in-Time Void Finder, specifically designed for precision cosmology. This method reconstructs the tracer velocity field to identify cosmic voids as points of maximum divergence in the displacement field, effectively pinpointing regions from which the largest mass outflows originate. Optimized for large-scale surveys, the algorithm produces catalogues of high-purity voids, tailored for various cosmological applications such as the void-galaxy cross-correlation function, void size function, and velocity profile analyses.
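        The core idea of locating voids at maxima of the displacement-field divergence can be sketched on a grid. The toy example below (a single analytic outflow standing in for a reconstructed survey field; grid size and box length are arbitrary) recovers the outflow centre as the point of maximum divergence:

```python
import numpy as np

# Toy displacement field Psi on a regular grid of shape (N, N, N, 3);
# in practice Psi would come from velocity-field reconstruction.
N, L = 64, 100.0          # grid size and box length, illustrative
dx = L / N
x = np.arange(N) * dx
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

# A single spherical outflow centred in the box as a stand-in:
r = np.stack([X - L / 2, Y - L / 2, Z - L / 2], axis=-1)
psi = r * np.exp(-(np.linalg.norm(r, axis=-1, keepdims=True) / 10.0) ** 2)

# Divergence of the displacement field via finite differences:
div = sum(np.gradient(psi[..., i], dx, axis=i) for i in range(3))

# Void candidates sit at local maxima of the divergence; here we
# simply take the global maximum.
idx = np.unravel_index(np.argmax(div), div.shape)
centre = np.array(idx) * dx
```

        A survey-grade implementation would instead search for all local maxima above a threshold and grow void regions around them.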

        Speaker: Simone Sartori (CPPM/CNRS (Marseille))
    • IV day: Simulations and Machine Learning for LSS
      • 38
        Constraining Cosmological Parameters with Machine Learning: Application to eROSITA galaxy clusters

        Given the X-ray observations of galaxy clusters by eROSITA and multi-cosmology simulations, one can compare their outcomes and use machine learning to infer the cosmological parameters. The key point lies in treating the simulations and observations from a probabilistic perspective. We are the first to match individual observed eROSITA galaxy clusters to multi-hydro-cosmology simulations, aiming to constrain the cosmological parameters.

        Speaker: Fucheng Zhong (Istituto Nazionale di Astrofisica (INAF))
      • 39
        TBD
        Speaker: Metcalf
      • 40
        Profile likelihoods for the neutrino mass, using latest cosmological datasets

        We derive constraints on the neutrino mass using a frequentist approach based on likelihood profiles. Our analysis leverages the latest cosmological datasets, including DESI DR2 BAO and DR1 full-shape likelihoods, CMB and CMB lensing from Planck and ACT, recent Lyman-alpha 1D power spectrum emulation applied to eBOSS, and supernovae data.
        Profile likelihoods offer several advantages when constraining the neutrino mass. First, as current constraints get closer and closer to the ∑mν = 0 limit, profile likelihoods truncated by this limit can easily be extrapolated to infer the statistical power of the data and to compare data combinations. Secondly, they are not subject to any of the prior effects that can arise in Bayesian inference.
        Our analysis features exciting advancements such as the full-shape likelihood from DESI DR1, which enables precise measurements of small-scale suppression independently of CMB lensing. This provides a valuable point of comparison and offers a fully LSS-based measurement when combined with the Lyman-alpha forest data.
        Finally, we discuss the constraints on the minimal sum of neutrino masses that arise from mass ordering and oscillation measurements of solar and atmospheric neutrinos.
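        As a schematic of the profiling procedure (a toy two-parameter chi-square, not the actual DESI/CMB likelihood; all numbers are invented), one minimises over the nuisance parameters at each fixed value of the parameter of interest and reads off confidence limits from Δχ²:

```python
import numpy as np
from scipy.optimize import minimize

# Toy chi^2 with a parameter of interest m (e.g. a mass sum bounded
# at zero, with the unconstrained best fit placed in the unphysical
# region m < 0) and one nuisance parameter b.
def chi2(m, b):
    return ((m + 0.05) / 0.04) ** 2 + ((b - 1.0) / 0.1) ** 2

def profile(m):
    """Minimise chi^2 over the nuisance parameter at fixed m."""
    return minimize(lambda b: chi2(m, b[0]), x0=[0.0]).fun

m_grid = np.linspace(0.0, 0.3, 31)          # physical region only
delta = np.array([profile(m) for m in m_grid])
delta -= delta.min()                         # Delta chi^2 profile

# 68% CL upper limit where Delta chi^2 = 1 (linear interpolation):
upper = np.interp(1.0, delta, m_grid)
```

        Because the profile is truncated at m = 0, its shape in the physical region carries the information used to extrapolate and compare data combinations, as described above.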

        Speaker: Domitille Chebat (CEA Saclay - IRFU - DPhP)
      • 10:30
        Coffee break
      • 41
        Summary and discussion
      • 42
        End of the conference