Catalogue Search | MBRL
Explore the vast range of titles available.
227 result(s) for "Petrillo, G."
LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors
2017
LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.
Journal Article
ICARUS at the Fermilab Short-Baseline Neutrino program: initial operation
2023
The ICARUS collaboration employed the 760-ton T600 detector in a successful 3-year physics run at the underground LNGS laboratory, performing a sensitive search for LSND-like anomalous νₑ appearance in the CERN Neutrino to Gran Sasso beam, which contributed to constraining the allowed neutrino oscillation parameters to a narrow region around 1 eV². After a significant overhaul at CERN, the T600 detector has been installed at Fermilab. In 2020 the cryogenic commissioning began with detector cool down, liquid argon filling and recirculation. ICARUS then started its operations collecting the first neutrino events from the Booster Neutrino Beam (BNB) and the Neutrinos at the Main Injector (NuMI) beam off-axis, which were used to test the ICARUS event selection, reconstruction and analysis algorithms. ICARUS successfully completed its commissioning phase in June 2022. The first goal of the ICARUS data taking will be a study to either confirm or refute the claim by the Neutrino-4 short-baseline reactor experiment. ICARUS will also perform measurements of neutrino cross sections with the NuMI beam and several Beyond Standard Model searches. After the first year of operations, ICARUS will search for evidence of sterile neutrinos jointly with the Short-Baseline Near Detector, within the Short-Baseline Neutrino program. In this paper, the main activities carried out during the overhauling and installation phases are highlighted. Preliminary technical results from the ICARUS commissioning data with the BNB and NuMI beams are presented, both in terms of performance of all ICARUS subsystems and of capability to select and reconstruct neutrino events.
Journal Article
Forecasting of the first hour aftershocks by means of the perceived magnitude
2019
The majority of strong earthquakes take place a few hours after a mainshock, motivating interest in real-time post-seismic forecasting, which is, however, very inefficient because of the incompleteness of available catalogs. Here we present a novel method that uses, as its only information, the ground velocity recorded during the first 30 min after the mainshock and does not require that signals are transferred and elaborated by operational units. The method considers the logarithm of the mainshock ground velocity, its peak value defined as the perceived magnitude, and the subsequent temporal decay. We conduct a forecast test on the nine M ≥ 6 mainshocks that have occurred since 2013 in the Aegean area. We are able to forecast the number of aftershocks recorded during the first 3 days after each mainshock with an accuracy smaller than 18% in all cases but one, for which it is 36%.
The timing and locations of aftershocks following the initial impact of an earthquake are key to mitigating potential further hazards. Here the authors use the seismic ground velocity as an input parameter to provide accurate probabilities of post-seismic occurrence within 30 min of the mainshock.
Journal Article
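The quantity the abstract above calls the perceived magnitude, the peak of the logarithm of the ground velocity in the first 30 min, can be sketched in a few lines. This is a minimal illustration on a synthetic decaying trace; the function name `perceived_magnitude`, the log floor, and the toy signal are assumptions, not code from the paper:

```python
import numpy as np

def perceived_magnitude(velocity):
    # Peak of the logarithm of the ground velocity, as described in the abstract.
    # A small floor avoids log10(0) on zero-valued samples.
    log_v = np.log10(np.abs(velocity) + 1e-12)
    return log_v.max()

# Toy trace standing in for the first 30 min (1800 s) of a seismogram:
# a decaying oscillation with peak amplitude 1e-3 (arbitrary units).
t = np.linspace(0.0, 1800.0, 18000)
v = 1e-3 * np.exp(-t / 600.0) * np.sin(2 * np.pi * t)
mp = perceived_magnitude(v)  # close to log10(1e-3) = -3 for this toy signal
```

The subsequent temporal decay of `log_v`, which the method also uses, could be fit on the same array.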
Estimating the Completeness Magnitude mc and the b‐Values in a Snap
2023
A good estimation of the b‐value is crucial for earthquake hazard assessment. Its evaluation can be strongly affected by an incorrect estimation of the completeness magnitude mc, because a too small mc will reflect in a small b‐value, whereas a too large mc will imply a larger standard deviation due to the reduction of the magnitude interval. Several methods for the estimation of mc exist; however, its evaluation is very delicate and requires some critical decision making in most cases. Here we present a new, very rapid and simple method for mc estimation. It is based on the observation that the Gutenberg‐Richter distribution is exponential only for magnitudes larger than mc. As a consequence, the average magnitude ma should increase linearly with a threshold magnitude mth. The departure from such linear behavior allows a correct estimation of mc, whereas the linearity of ma versus mth allows a correct estimation of the b‐value.
Plain Language Summary: A very simple and rapid method for the estimation of the completeness magnitude and of the b‐value is introduced here. The method is based on the evaluation of the average magnitude as a function of a threshold one. The method does not require any decision and can be easily implemented in automatic procedures.
Key Points: The b and mc parameters are rapidly estimated. The two parameters are estimated independently.
Journal Article
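The mechanics behind this entry's method, the average magnitude ma growing linearly with the threshold mth once mth exceeds mc, can be illustrated on a synthetic Gutenberg‐Richter catalog. This is a rough sketch under assumed values (b_true = 1, mc_true = 2, and an exponential thinning model for incompleteness); it is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
b_true, mc_true = 1.0, 2.0
beta = b_true * np.log(10)

# Gutenberg-Richter magnitudes down to m = 0, then thinned below mc_true
# to mimic catalog incompleteness (detection probability decays below mc):
m = rng.exponential(1.0 / beta, 200_000)
keep = (m >= mc_true) | (rng.random(m.size) < np.exp(m - mc_true))
m = m[keep]

# Average magnitude of events at or above each threshold:
thresholds = np.arange(0.0, 3.5, 0.1)
ma = np.array([m[m >= th].mean() for th in thresholds])

# Above mc the curve is linear, ma = mth + 1/(b ln 10); below mc it departs
# from that line.  An Aki-style estimate from a threshold safely above mc:
i = np.searchsorted(thresholds, 2.5)
b_est = np.log10(np.e) / (ma[i] - thresholds[i])
```

Scanning `ma - thresholds` for the point where it stops being constant is one simple way to locate mc in this picture.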
Automatic Earthquake Declustering Using the Nearest‐Neighbor Distance
2026
In the widely adopted description of seismic occurrence, earthquakes are categorized as either background or triggered events. In this work, we present a fully automated, non‐parametric algorithm for distinguishing between these two categories, a process known as seismic declustering, based on the widely used nearest‐neighbor (NN) metric. We introduce a new measure, the susceptibility index, which identifies an optimal threshold to discriminate between background and triggered events within the NN metric. Through statistical testing on simulated epidemic-type aftershock sequence catalogs, we demonstrate that our method yields classification metrics exceeding 90%, outperforming state‐of‐the‐art algorithms. Notably, we show that a single threshold is sufficient for reliable discrimination within a given data set. The identification of this threshold requires memory capacity and computational time that scale linearly and quadratically with the data set size, respectively, making the method particularly suited for large earthquake catalogs. We also apply our method to the relocated Southern California catalog and the GeoNet catalog of New Zealand (NZ). Our method effectively adapts across the different tectonic settings, capturing the variability of background seismicity rates between the shallow crustal events of Southern California and the tectonically diverse seismicity of NZ.
Plain Language Summary: Earthquakes are typically categorized into two types: background events, which occur randomly in time, and triggered events, which are caused by earlier earthquakes. Separating these two types is important for estimating the expected number of earthquakes in future decades. In this study, we introduce a new method that automatically separates background earthquakes from triggered ones using a metric that combines time, space and magnitude. By testing this method on simulated earthquake data sets, we show that it can classify earthquakes with over 90% accuracy. Our approach provides an optimal threshold to make this distinction and is computationally scalable, making it suitable for use on large earthquake data sets. We applied our method to real earthquake data from Southern California and New Zealand, and obtained reasonable estimates of the background rate.
Key Points: A unique threshold is sufficient for discriminating the background from the triggered component of the nearest‐neighbor distribution. A new automatic, non‐parametric algorithm is proposed to estimate the number of background events. The proposed algorithm effectively separates background from triggered events in both simulated and real earthquake catalogs.
Journal Article
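The nearest-neighbor metric this entry builds on can be sketched as below, following the common Zaliapin-style space-time-magnitude distance η = Δt · r^df · 10^(−b·m_parent). The parameter values, the distance floor, and the function name are assumptions, and the paper's susceptibility index for choosing the background/triggered threshold is not reproduced here:

```python
import numpy as np

def nn_distances(t, x, y, m, b=1.0, df=1.6):
    # For each event j, the nearest-neighbor distance to its best parent i < j:
    # eta = dt * r**df * 10**(-b * m_i); a small eta marks a likely triggered event.
    t, x, y, m = map(np.asarray, (t, x, y, m))
    eta = np.full(t.size, np.inf)          # the first event has no parent
    for j in range(1, t.size):
        dt = t[j] - t[:j]                  # inter-event times (t assumed sorted)
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])
        r = np.maximum(r, 1e-3)            # floor avoids eta = 0 at zero offset
        eta[j] = np.min(dt * r**df * 10.0 ** (-b * m[:j]))
    return eta

# Toy catalog: a mainshock, a nearby immediate aftershock, a distant late event.
t = np.array([0.0, 0.01, 100.0])
x = np.array([0.0, 0.01, 100.0])
y = np.array([0.0, 0.01, 100.0])
m = np.array([5.0, 2.0, 2.0])
eta = nn_distances(t, x, y, m)   # eta[1] is orders of magnitude below eta[2]
```

Declustering then reduces to comparing each η against a single threshold, the quantity the abstract's susceptibility index is designed to select.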
The b‐Value Tomography of the Calabrian Arc
2025
In the Calabrian Arc subduction zone, the notable lack of seismicity at depths near 100 km strongly suggests the presence of slab detachment. Contrary to typical patterns, where b‐values decrease with depth, our b‐value mapping reveals unexpectedly high b‐values at these depths. Within the 100–150 km depth interval, the gradient of the b‐value reaches its peak, indicating a significant reduction in stress. We propose four potential interpretations for these observations: (a) fluid‐induced weakening due to dehydration processes, (b) heterogeneity at the slab tip reducing rupture propagation, (c) creeping-zone behavior at the detachment tip, and (d) post‐detachment damage to the rocks, leaving them unable to support stress. These hypotheses remain beyond experimental verification at present. This study underscores the complex interplay of geological processes at depth and their implications for seismic hazard assessment in subduction zones.
Plain Language Summary: We present a three-dimensional map of the b‐value for the Calabrian Arc (Italy). We find that the b‐value assumes anomalously high values beneath a zone where there are no recorded earthquakes. This is a strong indication that the stress there is smaller. We then provide four different physical interpretations.
Key Points: Multiple physical explanations for the high b‐values observed, including fluid‐induced weakening and slab heterogeneity. Unusually high b‐values identified beneath the slab detachment area, suggesting lower stress levels at those depths. The b‐positive estimation offers greater reliability in identifying stress variations in subduction zones.
Journal Article
An Analytic Expression for the Volcanic Seismic Swarms Occurrence Rate. A Case Study of Some Volcanoes in the World
2023
Seismic swarms are defined as a group of earthquakes occurring very close in time and space but without any distinctively large event triggering their occurrence. Up to now, no simple law has been found to describe the swarm occurrence rate. Here we find an expression able to fit the average occurrence rate in some volcanic areas. This expression exhibits some differences with respect to the classical Omori law: namely, the c parameter of the Omori law is equal to zero, and the power-law decay of the average occurrence rate of the earthquakes is followed by an exponentially decaying regime. Both results can be interpreted in terms of fluid injection and/or movements. Indeed, this is a more impulsive phenomenon compared to the occurrence of a large earthquake, with a duration compatible with c = 0. The exponential decay following the power-law one could be explained by a viscoelastic relaxation of the stress induced by the injection and/or movements of fluids in the Earth's crust.
Key Points: We find an analytical expression for the volcanic earthquake occurrence rate. We analyze some volcanic seismic catalogs. We find some differences with respect to the traditional Omori law.
Journal Article
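The functional form described in this abstract, an Omori-type power law with c = 0 multiplied by an exponential relaxation term, can be written down directly. K, p, and tau below are hypothetical fit parameters, not values from the paper:

```python
import numpy as np

def swarm_rate(t, K, p, tau):
    # Average swarm occurrence rate: pure power-law decay (Omori law with
    # c = 0) followed, for t >> tau, by an exponentially decaying regime.
    return K * t ** (-p) * np.exp(-t / tau)

def omori_rate(t, K, c, p):
    # Classical (modified) Omori law for aftershocks, for comparison.
    return K / (c + t) ** p

# With c = 0 the swarm rate diverges as t -> 0, unlike the classical law
# with c > 0, while at late times the exponential term steepens the decay.
early = swarm_rate(1e-3, K=1.0, p=1.0, tau=10.0) > omori_rate(1e-3, K=1.0, c=0.1, p=1.0)
```

In a fit, these parameters would be estimated from the binned occurrence rate of a swarm catalog.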
Estimating the Completeness Magnitude mc and the b‐Values in a Snap
by Petrillo, G., Godano, C.
2023
A good estimation of the b‐value is crucial for earthquake hazard assessment. Its evaluation can be strongly affected by an incorrect estimation of the completeness magnitude mc, because a too small mc will reflect in a small b‐value, whereas a too large mc will imply a larger standard deviation due to the reduction of the magnitude interval. Several methods for the estimation of mc exist; however, its evaluation is very delicate and requires some critical decision making in most cases. Here we present a new, very rapid and simple method for mc estimation. It is based on the observation that the Gutenberg‐Richter distribution is exponential only for magnitudes larger than mc. As a consequence, the average magnitude ma should increase linearly with a threshold magnitude mth. The departure from such linear behavior allows a correct estimation of mc, whereas the linearity of ma versus mth allows a correct estimation of the b‐value. A very simple and rapid method for the estimation of the completeness magnitude and of the b‐value is introduced here. The method is based on the evaluation of the average magnitude as a function of a threshold one. The method does not require any decision and can be easily implemented in automatic procedures. The b and mc parameters are rapidly estimated; the two parameters are estimated independently.
Journal Article
Long-baseline neutrino oscillation physics potential of the DUNE experiment
by Petrillo, G., Calvez, S., Coan, T. E.
in Astronomy, Astrophysics and Cosmology, Elementary Particles
2020
The sensitivity of the Deep Underground Neutrino Experiment (DUNE) to neutrino oscillation is determined, based on a full simulation, reconstruction, and event selection of the far detector and a full simulation and parameterized analysis of the near detector. Detailed uncertainties due to the flux prediction, neutrino interaction model, and detector effects are included. DUNE will resolve the neutrino mass ordering to a precision of 5σ, for all δCP values, after 2 years of running with the nominal detector design and beam configuration. It has the potential to observe charge-parity violation in the neutrino sector to a precision of 3σ (5σ) after an exposure of 5 (10) years, for 50% of all δCP values. It will also make precise measurements of other parameters governing long-baseline neutrino oscillation, and after an exposure of 15 years will achieve a sensitivity to sin²2θ₁₃ similar to that of current reactor experiments.
Journal Article