Speaker
Description
This work presents a Deep Neural Network (DNN) approach for the detection of GRBs reported by external instruments in the AGILE-GRID energy range, between 100 MeV and 10 GeV, where the time and position of the GRB are known in advance. Taking into account the complex observation pattern of AGILE, we developed a Convolutional Neural Network (CNN), a class of DNN mainly used for image classification, and trained it on datasets of simulated AGILE-GRID maps. Half of these maps were simulated with background only, and the other half with background plus a GRB. The GRB model was derived from the first Fermi-GBM catalogue, convolved with the AGILE-GRID exposure variation. After the training phase, we tested the flexibility of the trained CNN under different observing models and conditions (based on the integrated exposure and the background level). For each condition, we provide a p-value distribution, used to define thresholds on the CNN classification values for different statistical significance levels. The CNN was applied to the Fermi-GBM, Fermi-LAT, and Swift-BAT catalogues: more than 20 AGILE-GRID counterparts above the three-sigma level were found. We also compared these results with those obtained with the traditional Li&Ma method; the trained CNN is more flexible because the analysis is not constrained by the selection of a background time window.
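To make the classification scheme and the comparison concrete, below is a minimal, illustrative Python sketch, not the authors' actual pipeline. It assumes hypothetical 100x100 single-channel intensity maps and an arbitrarily chosen network depth (the real AGILE-GRID map size and architecture are not specified here), and uses tf.keras; the li_ma_significance function implements the standard Li & Ma (1983, Eq. 17) formula that the Li&Ma comparison above refers to.

    # Illustrative sketch only: map size, layer sizes, and training setup are assumptions.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    def build_grb_cnn(input_shape=(100, 100, 1)):
        """Binary classifier: background-only map vs. background + GRB map."""
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(16, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(64, activation="relu"),
            # Sigmoid output: the CNN "classification value" in [0, 1]
            layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model

    def li_ma_significance(n_on, n_off, alpha):
        """Li & Ma (1983), Eq. 17: significance from on/off counts,
        with alpha the ratio of on-source to off-source exposure."""
        n_on, n_off = float(n_on), float(n_off)
        term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
        term_off = n_off * np.log((1.0 + alpha) * n_off / (n_on + n_off))
        return np.sqrt(2.0 * (term_on + term_off))

In this scheme, the sigmoid output plays the role of the CNN classification value, and p-value distributions built from background-only simulations would map it onto the significance thresholds described above; the Li&Ma significance, by contrast, requires an explicit off-source (background) time window, which is the constraint the CNN approach avoids.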