Description
The ability to compress observational data and accurately estimate physical parameters relies heavily on informative summary statistics. In this paper, we introduce the use of mutual information (MI) as a means of evaluating the quality of summary statistics in inference tasks. MI can assess the sufficiency of a summary and provides a quantitative basis for comparison. We propose to estimate MI using the Barber-Agakov lower bound with normalizing-flow-based variational distributions. To demonstrate the effectiveness of our approach, we conduct a comparative analysis of three summary statistics: the power spectrum, the bispectrum, and the scattering transform. The comparison is performed in the context of inferring physical parameters from simulated CMB maps and highly non-Gaussian 21 cm mock observations. Our results highlight the ability of our approach to correctly assess the informativeness of different summary statistics, enabling the selection of an optimal set of statistics for inference tasks.
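As a rough illustration of the estimator described above, the sketch below trains a variational posterior q(θ | t) on joint samples of parameters θ and summaries t, then evaluates the Barber-Agakov lower bound I(θ; t) ≥ H(θ) + E[log q(θ | t)]. This is not the paper's implementation: a diagonal-Gaussian posterior stands in for the conditional normalizing flow, the synthetic data, network sizes, and all names (`GaussianPosterior`, `barber_agakov_lower_bound`) are illustrative assumptions, and a PyTorch setup is assumed.

```python
import torch
import torch.nn as nn

class GaussianPosterior(nn.Module):
    """Variational posterior q(theta | t): a diagonal Gaussian whose mean and
    log-variance are predicted by an MLP from the summary statistic t.
    (A simplified stand-in for the conditional normalizing flow in the paper.)"""
    def __init__(self, summary_dim, param_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(summary_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * param_dim),
        )

    def log_prob(self, theta, t):
        mean, log_var = self.net(t).chunk(2, dim=-1)
        dist = torch.distributions.Normal(mean, log_var.mul(0.5).exp())
        return dist.log_prob(theta).sum(-1)

def barber_agakov_lower_bound(q, theta, t, prior_entropy):
    """Barber-Agakov bound: I(theta; t) >= H(theta) + E_{p(theta,t)}[log q(theta|t)]."""
    return prior_entropy + q.log_prob(theta, t).mean()

# --- usage on synthetic data (shapes and values are illustrative only) ---
param_dim, summary_dim, n = 2, 50, 10_000
theta = torch.rand(n, param_dim)                      # uniform prior on [0, 1]^2
t = theta @ torch.randn(param_dim, summary_dim) + 0.1 * torch.randn(n, summary_dim)

q = GaussianPosterior(summary_dim, param_dim)
opt = torch.optim.Adam(q.parameters(), lr=1e-3)
for _ in range(2000):                                 # maximize E[log q(theta|t)]
    opt.zero_grad()
    loss = -q.log_prob(theta, t).mean()
    loss.backward()
    opt.step()

prior_entropy = 0.0                                   # H(theta) of Uniform([0, 1]^2) in nats
mi_lb = barber_agakov_lower_bound(q, theta, t, prior_entropy)
print(f"MI lower bound: {mi_lb.item():.3f} nats")
```

In this setup, a tighter bound (larger value) for one summary statistic than another indicates that it is more informative about the parameters, which is the basis for the comparison of the power spectrum, bispectrum, and scattering transform described in the abstract.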