The possibility of optimizing the source flux and position at the same time has simplified the development of the pipeline. The same tool is also used in the manual verification procedure (see Section 4). More specifically, the null hypothesis can be formulated as an ensemble of models that keep the flux of the flaring source fixed to zero and the fluxes of the steady sources (if any) fixed to their known values.
For the alternative hypothesis (a flaring source is present), the flux and position of this source are allowed to be free, and the fluxes of the steady sources are fixed to their known values. As noted in Bulgarelli et al., the instrumental charged-particle background changes over time and space, and for this reason the corresponding parameters are kept free during the data analysis. A Gaussian smoothing of the map is performed with a typical kernel of three bins.
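To make the test concrete, the following minimal Python sketch (an illustration, not the actual AGILE-GRID analysis code) shows how a binned Poisson log-likelihood and the likelihood-ratio test statistic TS = 2(ln L1 - ln L0) could be computed once the predicted counts maps for the null and alternative hypotheses are available; the array names are assumptions.

```python
import numpy as np

def poisson_lnlike(counts, model):
    """Binned Poisson log-likelihood (the constant ln(n!) term is dropped)."""
    model = np.clip(model, 1e-30, None)  # guard against log(0)
    return np.sum(counts * np.log(model) - model)

def test_statistic(counts, model_null, model_alt):
    """Likelihood-ratio test statistic TS = 2 (lnL_alt - lnL_null)."""
    return 2.0 * (poisson_lnlike(counts, model_alt)
                  - poisson_lnlike(counts, model_null))

# model_null: predicted counts with the flaring-source flux fixed to zero;
# model_alt: predicted counts with the flaring-source flux (and position) free.
# For one extra free parameter, sqrt(TS) gives a rough pre-trial significance
# in sigma (Wilks' theorem), before any trials correction.
```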
The connected-component search procedure starts from these normalized and discretized maps. The search for connected-component regions is an iterative procedure. It starts by considering only the image that contains the pixels with value N and by calculating the connected-component regions contained in this first image. The effect of merging two levels is that the original regions grow by adding the pixels of the neighboring level.
This growing procedure stops when more than M connected regions are found, where M is a parameter (a typical value is 8). At the end, the barycenter of each connected region is calculated. This is the starting position for the MLE (see an example of the found connected regions in the first panel of Figure 4).
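A minimal sketch of the iterative connected-component search described above, assuming the smoothed intensity map has already been discretized into integer levels 0..N (this is an illustration based on scipy.ndimage, not the actual spotfinder implementation); levels are merged from the top down and the growth stops once more than M regions are found, returning the barycenter of each region as a seed for the MLE.

```python
from scipy import ndimage

def find_seed_regions(levels, M=8):
    """levels: 2-D integer numpy array (values 0..N) from the smoothed,
    discretized intensity map. Returns the (row, col) barycenters of the
    connected regions found before the stopping condition is reached."""
    centers = []
    # Start from the highest intensity level and progressively merge in the
    # neighboring lower level, so existing regions grow by absorbing its pixels.
    for threshold in range(int(levels.max()), 0, -1):
        mask = levels >= threshold
        labels, nregions = ndimage.label(mask)
        if nregions > M:  # stop growing once more than M connected regions appear
            break
        centers = ndimage.center_of_mass(mask, labels, list(range(1, nregions + 1)))
    return centers        # starting positions for the MLE
```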
In the first substep, all sources included in the initial ensemble of models, ordered by the intensity value of the bins contained in the connected region and above a predefined exposure-level threshold, are analyzed; the flux of the candidate flares is allowed to vary while the position is kept fixed at the center of the connected region. This step is useful to reduce the number of candidates for the final evaluation, which minimizes the complexity of the model. In the final evaluation, the flux and position of the candidate flares are allowed to vary, and the spectrum of each candidate is assumed to be a power law with the spectral index kept fixed at 2.
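The two sub-steps could be organized as in the hedged sketch below; mle_fit is a hypothetical wrapper around the maximum-likelihood estimator (not part of the published pipeline), the candidate dictionaries are assumed to carry the connected-region center, and the TS threshold used for screening is purely illustrative.

```python
def screen_and_refine(candidates, mle_fit, ts_keep=4.0):
    """candidates: list of dicts with a 'center' (l, b) seed from the spot finder.
    mle_fit: hypothetical callable returning (ts, flux, position)."""
    # Sub-step 1: flux free, position frozen at the connected-region center.
    screened = []
    for cand in candidates:
        ts, _, _ = mle_fit(position=cand["center"], free_position=False,
                           spectral_index=2.0)
        if ts >= ts_keep:              # keep only the promising candidates
            screened.append(cand)
    # Final evaluation: flux and position free, power-law spectrum with index 2.
    results = []
    for cand in screened:
        ts, flux, position = mle_fit(position=cand["center"], free_position=True,
                                     spectral_index=2.0)
        results.append({"ts": ts, "flux": flux, "position": position})
    return results
```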
In the end, we obtain a list of candidate transient sources and their pretrial statistical significance. The SMS and the App notifications contain a reduced version of the e-mail content. These notifications only include the significance level, the flux and related error, the exposure level, a possible association, and the list of analysis parameters.
The Final Verification Procedure For the most interesting automatic detections, a final verification is performed by human intervention. We can perform a check before the refined version of the scientific data is available. Both pipelines work with the common goal of producing scientific results in the shortest possible time and with the best data quality. The significance threshold might occasionally be lower if there is independent evidence of simultaneous activity from a reliable counterpart source at other wavelengths.
Quick Look of Sky Maps When an automatic candidate alert is received, the first step performed during the monitoring activity is the quick look of the data products. The App for mobile devices is deeply integrated into the scientific ground segment and is used for daily scientific activities. When an automatic candidate alert is received through the notification system, the private section of the App can be used to access the AGILE-GRID maps to check the automated results or the quality of the data.
Final Analysis To confirm or to improve the automated analysis results before the publication of an ATel, an analysis by human intervention is performed for a subclass of candidate transient events following the same data analysis strategy as reported in Section 3.
For the known sources, we use fixed positions and fluxes in the likelihood analysis. We then add to this ensemble of known sources a point-like source representing the transient candidate, initially using the position determined by the IASFBO SAS pipeline.
We then perform the likelihood analysis, initially setting the position and flux of this transient candidate free. In addition, we first estimate the ggal and giso parameters with a longer integration timescale (typically 15 days of integration) and then fix them for the short-timescale analysis, assuming that these parameters do not vary significantly on timescales of the order of hours to days. Typically, the integration time we use for this final analysis is the same as for the automatic analysis (one to two days).
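The two-stage treatment of the diffuse-emission parameters can be summarized with the short, hypothetical sketch below; likelihood_fit and its keyword arguments are placeholders rather than the actual analysis software, and the logic simply mirrors the text: ggal and giso are estimated on the long (about 15-day) integration and then frozen for the short (one to two-day) analysis of the transient candidate.

```python
def analyse_candidate(long_maps, short_maps, candidate_seed, likelihood_fit):
    """likelihood_fit: hypothetical MLE wrapper returning a dict of best-fit values."""
    # Stage 1: estimate the Galactic (ggal) and isotropic (giso) diffuse
    # coefficients on the long (~15-day) integration, where they are well constrained.
    long_fit = likelihood_fit(long_maps, free=("ggal", "giso"))
    # Stage 2: freeze ggal and giso at the long-timescale values (assumed stable
    # over hours to days) and fit the transient candidate, with flux and position
    # free, on the short (1-2 day) integration.
    return likelihood_fit(
        short_maps,
        fixed={"ggal": long_fit["ggal"], "giso": long_fit["giso"]},
        source_seed=candidate_seed,
        free=("flux", "position"),
    )
```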
This procedure implies that the 15 maps generated every day are not independent of one another, and our statistical analysis has to take this fact into account. This is a very different approach from the one taken by Bulgarelli et al. In the end, the final results are quite similar, but the hypothesis of Bulgarelli et al. ...
For the same reason, the evaluation of the post-trial probability of Bulgarelli et al. ... This procedure implies a sliding window offset by one orbit.
Statistical Significance of the Blind Search Procedure for Unknown Transient Sources First we address the blind search procedure: the flaring source is unknown, and the position and flux parameters of each candidate are allowed to be free and optimized with respect to the input data. The starting (l, b) position is determined with the previously described method, called spotfinder.
The determination of the likelihood-ratio distribution under the null hypothesis is used to evaluate the occurrence of false detections. To evaluate the p-value of the sliding-window approach, we performed simulations of empty Galactic regions.
The simulated observations were generated by adding Poisson-distributed counts in each pixel while considering the exposure level, the Galactic diffuse emission model, and the isotropic diffuse intensity. Each set of two-day generated maps (counts, exposure, and Galactic emission maps) has been analyzed using the same procedure as the on-orbit data. During the analysis, the spectra of all sources in the field are kept fixed.
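A minimal numpy sketch of how such an empty-field observation could be simulated from the ingredients listed above (the argument names and the assumption of mutually consistent units are ours, not the paper's): the expected counts per pixel are built from the exposure, the Galactic diffuse model, and the isotropic intensity, and Poisson counts are then drawn pixel by pixel.

```python
import numpy as np

def simulate_empty_field(exposure, gal_model, ggal, giso, rng=None):
    """exposure: exposure map per pixel; gal_model: Galactic diffuse intensity map;
    ggal, giso: Galactic and isotropic diffuse coefficients (consistent units assumed)."""
    rng = np.random.default_rng() if rng is None else rng
    # Expected counts per pixel from the diffuse components only
    # (no point sources, i.e. an "empty" Galactic region).
    expected = exposure * (ggal * gal_model + giso)
    # Poisson-distributed counts drawn independently in each pixel.
    return rng.poisson(expected)
```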
We then calculate the p-value distribution, analyzing at the same time one, three, and eight candidate flares as upper limits in an empty field, with the flux and position of each source allowed to vary. Table 1 reports the performed simulations and the related parameters. As already stated, similar results have been obtained by the analysis of an empty Galactic region with no sliding window (Bulgarelli et al.). Similar results have been obtained for eight sources in the ensemble of models.
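Given the test statistics obtained from a large set of such null-hypothesis simulations, the pre-trial p-value of an observed TS can be estimated empirically, as in the generic counting sketch below (with the usual +1 correction so the estimate never vanishes); this is a standard recipe, not necessarily the exact procedure adopted in the paper.

```python
import numpy as np

def empirical_p_value(ts_observed, ts_simulated):
    """Fraction of null-hypothesis simulations with TS >= the observed value."""
    ts_simulated = np.asarray(ts_simulated)
    exceed = np.count_nonzero(ts_simulated >= ts_observed)
    return (exceed + 1) / (ts_simulated.size + 1)
```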
Blazar microvariability at hard X-rays. L. Foschini, M. Gliozzi, E. Pian, G. Tagliaferri, F. Tavecchio, V. Bianchin, L. Maraschi, R. Sambruna, G. Di Cocco, G. Ghisellini, G. Malaguti, G. Tosti (arXiv). Blazars are known to display strong and erratic variability at almost all the wavelengths of the electromagnetic spectrum.
Presently, variability studies at high energies (hard X-rays, gamma-rays) are hampered by the low sensitivity of the instruments.
Some specific cases recently observed are presented and the physical implications are discussed.

Introduction According to the common paradigm, blazars are powered by a supermassive black hole with relativistic jets extending from the centre to the outer space. Relativistic effects play a dominant role. However, the gap between this generally accepted picture and a detailed physical model is still large. An incomplete list of topics to be addressed in this research field, with particular reference to the variability, should include: (i) how jets are generated, collimated, and accelerated; (ii) what is their composition (electrons, positrons, protons, ...). To investigate these questions and, from a more general point of view, the nature of ... Presently, this is not possible, particularly at high energies, because of technological limits. A large collecting area and a long-period (48-hour) orbit make XMM-Newton one of the ...
The IW provides a configuration and controller GUI to control and monitor the operations, an on-line operator GUI to display the acquired data in near real time, and an off-line operator GUI to retrieve and display archived data. The CIWS-FW aids the IW developer with basic software components and tools that can be extended in order (i) to model the data, (ii) to acquire and store the source data (L0) in raw format, (iii) to generate the transformed data (L1) according to the user data model, (iv) to archive and retrieve them according to the metadata specified by the user, and (v) to implement the operator GUIs.
The configuration and controller GUI is included in the framework as a built-in component, ready to configure and control the IW software. The highlighted gray components have been added in order to improve the configurability and extensibility attributes of the system and to offer new capabilities.
The Command component handles the commands that change the state of the IW system. In the packet stream case, the developer implements one Processor component for each input data type. The transformation is performed according to the L1 DDL model defined by the developer. Specific APIs are provided by the DAS component to archive the L1 profiled data; metadata are also associated with the file in a database.
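As an illustration only (this is not the CIWS-FW API), a Processor-style L0-to-L1 transformation might look like the following Python sketch, where the packet layout and the field names of the L1 record are invented for the example:

```python
import struct
from typing import Dict, Iterable, Iterator

def process_l0_packets(packets: Iterable[bytes]) -> Iterator[Dict]:
    """Hypothetical L0 -> L1 transformation: each raw packet is assumed to start
    with a 12-byte payload (uint32 timestamp, float32 value, uint32 channel)."""
    for raw in packets:
        timestamp, value, channel = struct.unpack("<IfI", raw[:12])
        # The L1 record follows a user-defined data model (here a plain dict).
        yield {"time": timestamp, "channel": channel, "value": value}
```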
In the file stream case, a command-oriented connection is assumed, which is served by the MCS Server. The MCSLib component provides the developer with the mechanisms to define the action to be performed for each command. The quick look (QL) component is able to monitor, annotate, and display the L0 and L1 data, either in near real time or off-line. The ReferenceCatalogue component allows the QL component to access custom astronomical source catalogues in order to accomplish tasks like producing a simulated field or cross-matching detected sources.
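For the cross-matching task mentioned above, a common approach (shown here as a generic astropy-based sketch, not the ReferenceCatalogue implementation, with an illustrative matching radius) is a nearest-neighbour match on the sky:

```python
import astropy.units as u
from astropy.coordinates import SkyCoord

def cross_match(det_ra, det_dec, cat_ra, cat_dec, radius_arcmin=6.0):
    """Match detected sources to the nearest catalogue source (coordinates in degrees)."""
    detections = SkyCoord(ra=det_ra * u.deg, dec=det_dec * u.deg)
    catalogue = SkyCoord(ra=cat_ra * u.deg, dec=cat_dec * u.deg)
    idx, sep2d, _ = detections.match_to_catalog_sky(catalogue)
    matched = sep2d < radius_arcmin * u.arcmin  # keep associations within the radius
    return idx, sep2d, matched
```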