Description
The intersection of quantum computing and machine learning, known as Quantum Machine Learning (QML), holds significant potential for data-intensive fields such as astrophysics. Astrophysics increasingly relies on Deep Learning (DL) to handle the vast datasets generated by ground-based and satellite experiments, yet the potential of quantum DL remains underexplored. This study evaluates Quantum Convolutional Neural Networks (QCNNs) for detecting transient Gamma-Ray Bursts (GRBs), using data from Cherenkov Telescope Array (CTA) simulations. GRBs, classified into short- and long-duration types, are critical astrophysical events that require prompt detection.
Building on previous work, we applied QCNNs to GRB data and compared their performance with classical Convolutional Neural Networks (CNNs). Our findings indicate that QCNNs achieve accuracy comparable to classical CNNs, often exceeding 90%, while using fewer parameters, suggesting potential efficiency gains. However, QCNNs require longer training times due to current limitations of quantum computing technology.
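To make the architecture concrete, the following is a minimal sketch of a QCNN-style binary classifier written with PennyLane. The qubit count, the layer structure (alternating two-qubit "convolution" and "pooling" blocks), and the toy input are illustrative assumptions, not the configuration used in this study.

```python
# Minimal QCNN-style classifier sketch (illustrative only; not the study's setup).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def conv_block(params, wires):
    """Two-qubit 'convolution': local trainable rotations plus an entangler."""
    qml.RY(params[0], wires=wires[0])
    qml.RY(params[1], wires=wires[1])
    qml.CNOT(wires=wires)

def pool_block(param, wires):
    """'Pooling': a controlled rotation that funnels information onto one qubit."""
    qml.CRZ(param, wires=wires)

@qml.qnode(dev)
def qcnn(x, params):
    # Angle-encode the (pre-scaled) input features, one feature per qubit.
    qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
    # Convolution layer over neighbouring qubit pairs.
    conv_block(params[0:2], wires=[0, 1])
    conv_block(params[2:4], wires=[2, 3])
    conv_block(params[4:6], wires=[1, 2])
    # Pooling layer: reduce the four active qubits to two.
    pool_block(params[6], wires=[0, 1])
    pool_block(params[7], wires=[3, 2])
    # Final convolution on the remaining qubits, then readout.
    conv_block(params[8:10], wires=[1, 2])
    return qml.expval(qml.PauliZ(2))

def predict(x, params):
    # Map the expectation value in [-1, 1] to a score in [0, 1].
    return (qcnn(x, params) + 1.0) / 2.0

if __name__ == "__main__":
    params = np.array([0.1] * 10, requires_grad=True)  # 10 trainable parameters
    x = np.array([0.1, 0.5, 0.9, 0.3])  # toy, pre-scaled features
    print("score:", predict(x, params))
```

Note the small parameter count (10 trainable angles here) compared with a classical CNN of similar depth, which is the kind of parameter efficiency discussed above.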
We conducted comprehensive benchmarks, examining the impact of hyperparameters such as the number of qubits and the encoding method, including Data Reuploading. While increasing the number of qubits and using more sophisticated encodings generally improved performance, it also increased circuit complexity and training time. QCNNs demonstrated robust performance on both time-series and sky-map datasets, maintaining high accuracy with fewer parameters.
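For readers unfamiliar with Data Reuploading, the sketch below shows the idea in PennyLane: the same classical features are re-encoded before every variational layer, so expressivity grows with circuit depth while the qubit count stays fixed. The qubit and layer counts are illustrative assumptions, not the settings benchmarked in this work.

```python
# Data Reuploading sketch (illustrative only; not the benchmarked configuration).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 3
n_layers = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def reuploading_circuit(x, weights):
    for layer in range(n_layers):
        # Re-encode the input features at every layer ("reuploading").
        qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
        # Trainable single-qubit rotations followed by a ring of entanglers.
        for w in range(n_qubits):
            qml.Rot(*weights[layer, w], wires=w)
        for w in range(n_qubits):
            qml.CNOT(wires=[w, (w + 1) % n_qubits])
    return qml.expval(qml.PauliZ(0))

if __name__ == "__main__":
    # One (phi, theta, omega) triple per qubit per layer.
    weights = np.zeros((n_layers, n_qubits, 3), requires_grad=True)
    x = np.array([0.2, 0.7, 0.4])  # toy, pre-scaled features
    print("output:", reuploading_circuit(x, weights))
```

The trade-off mentioned above is visible here: each additional reuploading layer adds both encoding and trainable gates, which tends to improve accuracy but lengthens simulation and training time.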
By exploring the application of QCNNs in astrophysics, this study highlights their potential and limitations. While QCNNs show promise, particularly in parameter efficiency, further optimization of quantum hardware and software is necessary for real-time applications. Our work lays the groundwork for future research, offering insights that could lead to significant advances in the application of quantum neural networks to astrophysical data analysis.