Description
Scientific computing in astrophysics faces challenges related to the implementation and optimization of algorithms, data analysis pipelines, and the management of computing resources. Moreover, the use of large computational resources often implies an enormous demand for storage space, with data temporarily saved on scratch storage. To achieve optimal performance, scratch storage is not designed to be highly available or persistent, so a slower but much more robust storage area is needed to preserve data in long-term repositories or even in structured archives. This is a no less important issue to deal with in critical computing, particularly in the context of the FAIR Principles and Open Data: appropriate data management plans must therefore be taken into account as well. Here we present the current hardware resources, the architecture and the services related to Big Data challenges available to the community, along with the future perspectives for supporting critical (and less critical) computational topics within INAF.
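
As a purely illustrative sketch of the scratch-to-repository step described above (paths and layout are hypothetical, not a description of any actual INAF service), a staging script might copy results from the fast but volatile scratch area to a long-term storage area and verify their integrity with checksums:

    # Illustrative sketch only: stage results from a fast scratch area to a
    # more robust long-term storage area, verifying integrity via checksums.
    # Paths are hypothetical examples, not INAF-specific locations.
    import hashlib
    import shutil
    from pathlib import Path

    SCRATCH = Path("/scratch/project/run_001")   # hypothetical scratch area
    ARCHIVE = Path("/archive/project/run_001")   # hypothetical long-term area


    def sha256sum(path: Path) -> str:
        """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()


    def stage_to_archive(scratch: Path, archive: Path) -> None:
        """Copy every file from scratch to archive and verify its checksum."""
        for src in scratch.rglob("*"):
            if not src.is_file():
                continue
            dst = archive / src.relative_to(scratch)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)               # preserves timestamps/metadata
            if sha256sum(src) != sha256sum(dst):
                raise IOError(f"checksum mismatch while archiving {src}")


    if __name__ == "__main__":
        stage_to_archive(SCRATCH, ARCHIVE)

In practice such staging would typically be complemented by the metadata capture required by a data management plan, so that archived products remain findable and reusable.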