The Euclid mission, an M-class mission of the ESA Cosmic Vision programme, aims to map the geometry of the dark universe through two independent probes: weak gravitational lensing and galaxy clustering. It will survey the extragalactic sky in the visible and near-infrared bands, generating up to 30 petabytes of data. The architecture of the Euclid Science...
As the Gaia mission approaches the end of its scientific life (June 2025) after 11 years of continuous in-orbit operations, this presentation addresses the work that Team INAF has started at our Data Processing facility (DPCT@ALTEC) to face the challenges of the mission legacy, as advocated and prototyped in the framework of The Living Sky Project. We focus on the computational aspects implied...
We ported the Astrometric Verification Unit–Global Sphere Reconstruction (AVU–GSR) Parallel Solver, developed for the ESA Gaia mission, to GPUs with CUDA by optimizing a previous OpenACC port of the application. The code aims to find, with a precision of [10, 100] μarcsec, the astrometric parameters of ∼10^8 stars, the attitude and instrumental settings of the Gaia satellite, and the global...
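Schematically, and only as a hedged illustration of the standard global-sphere-reconstruction formulation rather than a verbatim transcription of the AVU–GSR system, the solver searches for the parameter vector that minimizes a large, sparse least-squares problem:

$$
\min_{\mathbf{x}} \left\lVert A\,\mathbf{x} - \mathbf{b} \right\rVert_2^2,
\qquad
\mathbf{x} = \big(\mathbf{x}_{\mathrm{astrometric}},\ \mathbf{x}_{\mathrm{attitude}},\ \mathbf{x}_{\mathrm{instrument}},\ \mathbf{x}_{\mathrm{global}}\big),
$$

where $A$ is a large sparse design matrix built from the observations and $\mathbf{b}$ the vector of observational residuals; with ∼10^8 stars the system can only be solved iteratively, which is why accelerating each iteration on the GPU matters.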
Fostered by upcoming data from new-generation observational campaigns, we are about to enter a new era for the study of how galaxies form and evolve.
The unprecedented quantity of data that will be collected, from distances only marginally probed up to now, will require analysis tools designed to target the specific physical peculiarities of the observed sources and handle extremely...
Binary population-synthesis codes are crucial tools to study the evolution of massive binaries and their impact on the demography of binary black holes.
I present a new, fully revised version of the population synthesis code SEVN (Stellar EVolution for N-body codes, Spera et al. 2019).
Stellar evolution is computed by interpolating look-up tables containing a grid of stellar...
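As a schematic illustration of this table-based approach (a minimal sketch, not the actual SEVN implementation; the grid, the tabulated property and its values below are invented for the example), a stellar property tabulated along pre-computed tracks can be interpolated as follows:

```python
# Schematic illustration of look-up-table interpolation (not the SEVN code):
# a stellar property tabulated on an (initial mass, time) grid is evaluated
# for arbitrary stars by bilinear interpolation.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Toy grid: a handful of tracks (rows: initial masses, columns: times).
masses = np.array([10.0, 20.0, 40.0, 80.0])          # Msun
times = np.linspace(0.0, 10.0, 50)                    # Myr
# Placeholder table of, say, stellar radii along each track.
radius_table = masses[:, None] ** 0.8 * (1.0 + 0.1 * times[None, :])

interp = RegularGridInterpolator((masses, times), radius_table)

# Evaluate the property for stars with arbitrary mass and age within the grid.
stars = np.array([[15.0, 3.2], [55.0, 7.9]])          # (mass, age) pairs
print(interp(stars))
```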
The OPS4 has been designed by a joint INAF-ALTEC team as a legacy Big Data facility, building on the experience of the TLS prototype and the Gaia data reduction.
Its design takes into account the scalability and performance requirements necessary for the analysis and exploitation of Big Data dedicated to the investigation of the nearby Universe in the context of multimessenger...
Thanks to the continuous expansion and improvement of the observational datasets on circumstellar disks, exoplanets and our own Solar System, planet formation and astrochemistry are becoming increasingly interconnected fields of study. The growth and migration processes that shape planet formation in the native circumstellar disks create a bidirectional link between planetary and disk...
In my talk I will give an overview of the current landscape of large simulations of the formation of cosmic structures. After outlining the role that modern simulations play in the field of HPC, I will highlight their growing relevance as a theoretical framework for the interpretation of observational data, and as a support to ongoing and future observational facilities in astrophysics and cosmology....
The PLUTO code, developed at the University of Torino in collaboration with the Osservatorio Astrofisico di Torino, is one of the most widely used public codes for astrophysical fluid dynamics and magnetohydrodynamics, in both the classical and relativistic regimes. The code is designed with a modular and flexible structure whereby different numerical algorithms can be separately combined to...
In this talk, I will outline the activities in the field of numerical astrophysics at INAF-OAPa and the role of the local Sistema Computazionale per l'Astrofisica Numerica (SCAN). The availability of local computational resources fostered the realization of several astrophysics projects, representing a springboard for large computational programs which used massive international HPC systems....
"Recent studies showed an interesting mapping of the 6-dimensional+1 (6D + 1) collisionless fluid (Vlasov-Poisson)
problem into a more amenable 3D + 1 non-linear Schr ̈odinger-Poisson (SP) problem for simulating
the evolution of DM perturbations. This opens up the possibility of improving the scaling of time
propagation simulations using quantum computing. We propose a rigorous formulation...
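For reference, the Schrödinger-Poisson system used in this kind of mapping is usually written as follows (standard form; normalization conventions may differ from those adopted in the talk):

$$
i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\nabla^2\psi + m\,V\,\psi,
\qquad
\nabla^2 V = 4\pi G\left(\rho - \bar{\rho}\right), \quad \rho = m\,|\psi|^2,
$$

where the ratio $\hbar/m$ acts as an effective phase-space resolution parameter and the 6D+1 Vlasov dynamics is recovered in the limit $\hbar/m \to 0$.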
One of the major challenges in the context of the Cosmic Microwave Background (CMB) radiation is to detect a particular polarization pattern, the so-called B-modes of CMB polarization, which is thought to be directly linked to the quantum tensor fluctuations produced in the Universe during the inflationary phase. To date, several challenges have prevented the detection of the B-modes, partly because of the lower...
Modern astronomy and astrophysics produce extremely large data volumes (of the order of petabytes) coming from observations or from simulation codes executed on high-performance supercomputers. Such data volumes pose significant challenges for storage, access and data analysis. Visual exploration of big datasets poses some critical challenges that must drive the development of a new generation of...
Cosmological hydrodynamic simulations are unique and successful tools for investigating the evolution of galaxies in a cosmological context. The introduction of black holes (BHs) and their feedback into simulations, which is mandatory to model the intertwined growth of BHs and their host galaxies, runs up against numerical limitations that are circumvented with ad hoc sub-resolution techniques.
However, the accurate...
Many scientific codes and data-focused algorithms are quickly being rendered obsolete.
The gargantuan size of the problems we face is mainly responsible for this.
On the one hand, the data sets coming from both ground- and space-based observations are already much larger than in the recent past and will grow further by at least an order of magnitude. On the other hand, the target...
Imaging Air Cherenkov Telescopes (IACTs) play a crucial role in the field of ultra-high energy astrophysics (E > 10 GeV), making a fundamental contribution to the study of cosmic gamma-ray sources. As the sensitivity and complexity of IACT systems increase, so does the demand for efficient data reduction techniques and computational resources to handle the ever-growing data volumes. This...
Pulsar Wind Nebulae are powered by the relativistic, magnetized and cold wind emanating from a rapidly rotating neutron star (the pulsar) that interacts with the ambient medium. They are visible as bright non-thermal sources at a very broad range of energies, from radio to gamma-rays, with a variety of different morphologies.
Pulsar Wind Nebulae are perfect places to look for...
I will introduce the DEMNUni simulation set, which accounts for different cosmologies with massive neutrinos and dynamical dark energy. I will briefly present the scientific results obtained up to now. Then, I will focus on the computational resources (both ISCRA and MoU CINECA-INAF) needed to produce the DEMNUni suite, as well as the long-term storage required to support and maintain the...
We present the computational requirements of the infrastructure needed by the Monitoring System (MON) of the Cherenkov Telescope Array (CTA) in two different scenarios: the performance tests and the on-site deployment. The CTA will be composed of hundreds of telescopes working together to unveil some of the fundamental physics of the high-energy Universe. Along with the scientific...
The solar corona shows inexplicably high temperatures, up to millions of degrees, compared with the much cooler photosphere below. The conversion of magnetic energy into thermal energy through magnetic reconnection has been pursued for decades as the mechanism to explain this phenomenon. The reason this mechanism remains elusive is that individual reconnection events are too small...
The solar corona consists of plasma confined by, and interacting with, the coronal magnetic field. The magnetic processes are highly dynamic and non-linear, and their description requires time-dependent magnetohydrodynamic modelling on high-performance computing systems.
Large-scale energy release in the corona may involve MHD instabilities such as the kink instability in a single twisted...
Pulsar Wind Nebulae (PWNe) constitute a magnificent lab to investigate high-energy astrophysics in its many facets, from non-thermal emission to particle acceleration, from relativistic fluid dynamics to anti-matter creation. The group in Arcetri has always been one of the leading teams in the study of PWNe, and through the years has developed a vast and advanced suite of numerical tools for...
In numerical experiments on the propagation of relativistic jets produced, e.g., by Supermassive Black Holes (SMBHs), the covariant equation of state determines a relationship between density, pressure and temperature. While the first is proportional to the Lorentz factor and the second is invariant, one is left with different possibilities concerning the temperature. The usual choice adopted in...
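As an explicit illustration (assuming, for definiteness, an ideal-gas closure with constant adiabatic index; the specific covariant equation of state discussed in the talk may differ), the relations between laboratory-frame density, pressure and temperature can be written as:

$$
D = \Gamma\,\rho, \qquad
h = 1 + \frac{\gamma}{\gamma - 1}\,\frac{p}{\rho c^2}, \qquad
\frac{p}{\rho} = \frac{k_B T}{\mu m_H},
$$

where $D$ is the laboratory-frame density, $\rho$ the proper rest-mass density, $\Gamma$ the bulk Lorentz factor, $p$ the (invariant) pressure, $h$ the specific enthalpy and $\gamma$ the adiabatic index; the last relation is where the different possible definitions of the jet temperature enter.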
Core-collapse supernova remnants (SNRs) exhibit intricate morphologies and a highly non-uniform distribution of stellar debris. In the case of young remnants (less than 5000 years old), their characteristics offer insights into the inner processes of the supernova (SN) engine, including nucleosynthetic yields and large-scale asymmetries originating from the early stages of the explosion....
Magnetic fields manifest themselves almost everywhere in the Universe. Their effects are visible through different kinds of electromagnetic radiation and in the spectra of cosmic rays. In our talk, we focus on the magnetic field in supernova remnants. We show how high-performance computing allows us to investigate the evolution of magnetic field in the remnants of different types of supernova...
The SKA precursor communities are currently developing new software to automate the processing of radio images for various tasks, including source extraction, object or morphology classification, and anomaly detection. These developments heavily rely on HPC processing paradigms and machine learning (ML) methodologies.
In this context, we are developing several tools to support the...
In recent years, the remarkable capabilities offered by precursors and pathfinders of the Square Kilometre Array (SKA) have started revolutionizing our view even of previously well-known objects such as jetted Active Galactic Nuclei (AGN). Particularly in the MHz-frequency regime, observations are now able to uncover the oldest plasma injected by AGN jets into their surrounding environment,...
The effective exploitation of modern architectures is a key factor in achieving the best performance in terms of both energy efficiency and run-time reduction.
We present a specific example of this by discussing the W-stacking gridder, an algorithm that tackles radio imaging on massively parallel systems; its performance is limited by an all-to-all data reduction needed to pass from time-domain...
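The sketch below illustrates the structure of the problem under discussion; it is a deliberately simplified stand-in, not the production gridder, and the grid size, the nearest-neighbour gridding kernel and the choice of a root-based reduction are illustrative assumptions. Each MPI rank grids its local visibilities, and the partial uv-planes are then combined through a collective reduction, which is the communication step that limits performance.

```python
# Minimal sketch (not the production gridder): each MPI rank grids its own
# share of visibilities onto a local uv-plane, then the planes are combined
# with a reduction. Grid size and the reduction pattern are illustrative only.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NPIX = 256                                   # toy uv-grid size
local_grid = np.zeros((NPIX, NPIX), dtype=np.complex128)

# Toy visibilities owned by this rank: (u, v) in pixel units plus a complex value.
rng = np.random.default_rng(rank)
u = rng.integers(0, NPIX, 10_000)
v = rng.integers(0, NPIX, 10_000)
vis = rng.normal(size=10_000) + 1j * rng.normal(size=10_000)

# Nearest-neighbour gridding of the local visibilities.
np.add.at(local_grid, (v, u), vis)

# The reduction step that dominates the communication cost:
# every rank contributes its partial grid to the final uv-plane.
full_grid = np.zeros_like(local_grid) if rank == 0 else None
comm.Reduce(local_grid, full_grid, op=MPI.SUM, root=0)

if rank == 0:
    print("gridded", size * 10_000, "visibilities onto", full_grid.shape)
```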
SKA precursors are giving us a first glimpse of the future capabilities of the SKA. Designed to be the most sensitive radio telescopes ever, the precursors are planned to release large-area surveys with arcsec resolution. However, the final image product is heavily influenced by the data reduction. Not only does the huge data volume make a careful visual inspection and manual reduction of the data...
Radio astronomy is evolving toward ever larger and more accurate datasets.
As soon as the SKA telescopes are fully operational, hundreds of petabytes of data will be produced each year with unprecedented resolution and detail.
This rapid evolution drives the development of Big Data analysis and visualization tools and services, which will need to be supported by suitable...
Scientific computing in astrophysics certainly faces challenges related to the implementation and optimization of algorithms, data analysis pipelines, and the management of computing resources. At the same time, using huge computational resources may imply an enormous need for storage space, temporarily held on scratch storage. To reach optimal performance, scratch storage is not intended to...
The start of the Legacy Survey of Space and Time (LSST) will open the possibility of performing statistical studies of the physical properties of supernova transients. These studies need sufficiently fast and accurate procedures for the characterization of these events. In this framework, we have developed a Bayesian Analytic Modeling procedure, which is able to simulate Supernovae Light...
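Purely as an illustration of this kind of procedure (a minimal sketch under invented assumptions: a Bazin-like analytic flux model, a Gaussian likelihood, flat priors and the emcee sampler, none of which are necessarily those of the procedure presented here), a Bayesian fit of an analytic light-curve model can look like this:

```python
# Illustrative sketch only: a Bayesian fit of a generic analytic supernova
# light-curve model (Bazin-like rise/fall shape). The actual model and priors
# of the procedure described above may differ.
import numpy as np
import emcee

def model(t, A, t0, t_rise, t_fall, B):
    """Analytic flux: exponential decline modulated by a sigmoidal rise."""
    return A * np.exp(-(t - t0) / t_fall) / (1.0 + np.exp(-(t - t0) / t_rise)) + B

def log_posterior(theta, t, flux, err):
    A, t0, t_rise, t_fall, B = theta
    if A <= 0 or t_rise <= 0 or t_fall <= 0:      # flat priors with positivity
        return -np.inf
    resid = (flux - model(t, *theta)) / err
    return -0.5 * np.sum(resid**2)

# Toy data drawn from the model itself, just to make the sketch runnable.
rng = np.random.default_rng(0)
t = np.linspace(-20, 80, 60)
truth = (100.0, 0.0, 4.0, 25.0, 5.0)
flux = model(t, *truth) + rng.normal(0, 3.0, t.size)

ndim, nwalkers = 5, 32
p0 = np.array(truth) + 1e-2 * rng.normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior, args=(t, flux, 3.0))
sampler.run_mcmc(p0, 2000, progress=False)
print("posterior medians:", np.median(sampler.get_chain(discard=500, flat=True), axis=0))
```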