Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
14 result(s) for "Rubio-Montero, Antonio Juan"
Enhanced Particle Classification in Water Cherenkov Detectors Using Machine Learning: Modeling and Validation with Monte Carlo Simulation Datasets
by Sidelnik, Ivan; Dasso, Sergio; Molina, Maria Graciela
in Algorithms, Artificial intelligence, astroparticle detectors
2024
The Latin American Giant Observatory (LAGO) is a ground-based extended cosmic ray observatory designed to study transient astrophysical events, the role of the atmosphere in the formation of secondary particles, and space-weather-related phenomena. With the use of a network of Water Cherenkov Detectors (WCDs), LAGO measures the secondary particle flux, a consequence of the interaction of astroparticles impinging on the atmosphere of Earth. This flux can be grouped into three distinct basic constituents: electromagnetic, muonic, and hadronic components. When a particle enters a WCD, it generates a measurable signal characterized by unique features correlating to the particle’s type and the detector’s specific response. The resulting charge histograms from these signals provide valuable insights into the flux of primary astroparticles and their key characteristics. However, these data are insufficient to effectively distinguish between the contributions of different secondary particles. In this work, we extend our previous research by using detailed simulations of the expected atmospheric response to the primary flux and the corresponding response of our WCDs to atmospheric radiation. This dataset, created by combining the outputs of the ARTI and Meiga simulation frameworks, contains the expected WCD signals produced by the flux of secondary particles during one day at the LAGO site in Bariloche, Argentina, situated at 865 m above sea level. This was achieved by analyzing the real-time magnetospheric and local atmospheric conditions for February and March of 2012, where the resultant atmospheric secondary-particle flux was integrated into a specific Meiga application featuring a comprehensive Geant4 model of the WCD at this LAGO location. The final output was modified for effective integration into our machine-learning pipeline. With an implementation of Ordering Points to Identify the Clustering Structure (OPTICS), a density-based clustering algorithm used to identify patterns in data collected by a single WCD, we have further refined our approach to implement a method that categorizes particle groups using advanced unsupervised machine learning techniques. This allowed for the differentiation among particle types and utilized the detector’s nuanced response to each, thus pinpointing the principal contributors within each group. Our analysis has demonstrated that applying our enhanced methodology can accurately identify the originating particles with a high degree of confidence on a single-pulse basis, highlighting its precision and reliability. These promising results suggest the feasibility of future implementations of machine-learning-based models throughout LAGO’s distributed detection network and other astroparticle observatories for semi-automated, onboard and real-time data analysis.
Journal Article
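The abstract above names OPTICS, a density-based clustering algorithm, as the method used to separate single-WCD pulses into particle groups. The following is a minimal sketch of that technique using scikit-learn's OPTICS on synthetic pulse features; the three-feature representation (total charge, peak amplitude, rise time) and all parameter values are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch of density-based pulse clustering with OPTICS.
# The feature set and population parameters below are assumptions for
# illustration only, not the features or data used by the LAGO analysis.
import numpy as np
from sklearn.cluster import OPTICS
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic stand-ins for three secondary-particle populations
# (electromagnetic, muonic, hadronic), each with a distinct pulse shape.
em      = rng.normal(loc=[ 50, 10, 20], scale=[10, 2,  4], size=(500, 3))
muons   = rng.normal(loc=[200, 40, 60], scale=[20, 5,  8], size=(300, 3))
hadrons = rng.normal(loc=[400, 60, 90], scale=[40, 8, 12], size=(100, 3))
pulses = np.vstack([em, muons, hadrons])  # columns: charge, amplitude, rise time

# OPTICS orders points by reachability distance; clusters are then
# extracted with the xi method. Label -1 marks noise pulses.
labels = OPTICS(min_samples=20, xi=0.05, min_cluster_size=0.05).fit_predict(
    StandardScaler().fit_transform(pulses)
)
for k in sorted(set(labels)):
    print(f"cluster {k:2d}: {np.sum(labels == k)} pulses")
```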
Adapting Reproducible Research Capabilities to Resilient Distributed Calculations
by
Rodríguez-Pascual, Manuel
,
Mayo-García, Rafael
,
Rubio-Montero, Antonio Juan
in
Decoupling
,
Fault tolerance
,
Middleware
2016
Nowadays, computing calculations are becoming increasingly demanding due to the huge pool of resources available. This demand must be satisfied in terms of computational efficiency and resilience, both of which are compromised on distributed and heterogeneous platforms. Moreover, the data obtained are often either reused by other researchers or recalculated. In this work, a set of tools to overcome the problem of creating and executing fault-tolerant distributed applications on dynamic environments is presented. This set also ensures the reproducibility of the performed experiments, providing a portable, unattended and resilient framework that encapsulates the infrastructure-dependent operations away from the application developers and users, allowing experiments to be performed on Open Access data repositories. In this way, users can seamlessly search and later access datasets that can be automatically retrieved as input data into a code already integrated in the proposed workflow. Such a search is based on metadata standards and relies on Persistent Identifiers (PIDs) to reference specific repositories. The applications profit from Distributed Toolbox, a framework devoted to the creation and execution of distributed applications that includes tools for unattended cluster and grid execution with full fault tolerance. By decoupling the definition of the remote tasks from their execution and control, the development, execution and maintenance of distributed applications are significantly simplified with respect to previous solutions, increasing their robustness and allowing them to run on different computational platforms with little effort. The integration with Open Access databases and the employment of PIDs for long-lasting references ensure that the data related to the experiments will persist, closing a complete research circle of data access/processing/storage/dissemination of results.
Journal Article
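The abstract above describes locating Open Access datasets through metadata searches that rely on Persistent Identifiers (PIDs). Below is a minimal sketch of one such lookup, assuming the PID is a DOI registered with DataCite and resolved through its public REST API; the DOI shown is a placeholder, and the actual Distributed Toolbox integration is not reproduced here.

```python
# Hedged sketch of a PID-based dataset lookup: resolve a DOI through the
# public DataCite REST API to find the repository landing page before the
# data are staged as workflow input. The DOI below is a placeholder.
import requests

def resolve_pid(doi: str) -> dict:
    """Return the title and repository URL registered for a DataCite DOI."""
    resp = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
    resp.raise_for_status()
    attrs = resp.json()["data"]["attributes"]
    return {"title": attrs["titles"][0]["title"], "url": attrs["url"]}

record = resolve_pid("10.5281/zenodo.123456")  # hypothetical PID
print(record["title"], "->", record["url"])
```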
A Novel Cloud-Based Framework for Standardised Simulations in the Latin American Giant Observatory (LAGO)
by
Sidelnik, Iván
,
Pardo-Diaz, Alfonso
,
Mayo-García, Rafael
in
Astrophysics
,
Cerenkov counters
,
Cosmic rays
2022
LAGO, the Latin American Giant Observatory, is an extended cosmic ray observatory consisting of a wide network of water Cherenkov detectors located in 10 countries. With different altitudes and geomagnetic rigidity cutoffs, their geographic distribution, combined with the new electronics for control, atmospheric sensing and data acquisition, allows the realisation of diverse astrophysics studies at a regional scale. It is an observatory designed, built and operated by the LAGO Collaboration, a non-centralised alliance of 30 institutions from 11 countries. While LAGO has access to different computational frameworks, it lacks standardised computational mechanisms to fully exploit its cooperative approach. The European Commission is fostering initiatives aligned with LAGO objectives, especially to enable Open Science and its long-term sustainability. This work introduces the adaptation of LAGO to this paradigm within the EOSC-Synergy project, focusing on the simulations of the expected astrophysical signatures at detectors deployed at the LAGO sites around the world.
The EOSC-Synergy cloud services implementation for the Latin American Giant Observatory (LAGO)
by Sidelnik, Iván; Pagán-Muñoz, Raúl; Rubio-Montero, Antonio Juan
in Atmospherics, Cerenkov counters, Cloud computing
2021
The Latin American Giant Observatory (LAGO) is a distributed cosmic ray observatory at a regional scale in Latin America, built by deploying a large network of Water Cherenkov detectors (WCD) and other astroparticle detectors over a wide range of latitudes, from Antarctica to México, and altitudes, from sea level to more than 5500 m a.s.l. Detector telemetry, atmospheric conditions and the flux of secondary particles at ground level are measured in extreme detail at each LAGO site by using our own-designed hardware and firmware (ACQUA). To combine and analyse all these data, LAGO developed ANNA, our data analysis framework. Additionally, ARTI is a complete simulation framework designed to simulate the expected signals at our detectors coming from primary cosmic rays entering the Earth's atmosphere, allowing a precise characterization of the sites under realistic atmospheric, geomagnetic and detector conditions. As measured and synthetic data have started to flow, we face challenging scenarios given the large amount of data emerging from a diversity of detectors, computing architectures and e-infrastructures. These data need to be transferred, analyzed, catalogued, preserved, and provided for internal and public access and data mining under an open e-science environment. In this work, we present the implementation of ARTI on the EOSC-Synergy cloud-based services as the first example of LAGO's frameworks following the FAIR principles for provenance, data curation and re-use of data. For this, we calculate the flux of secondary particles expected in up to 1 week at detector level for all 26 LAGO sites, and the 1-year flux of high-energy secondaries expected at the ANDES Underground Laboratory and other sites. Therefore, we show how this development can help not only LAGO but also other data-intensive cosmic ray observatories, muography experiments and underground laboratories.
Measurement of inelastic, single- and double-diffraction cross sections in proton–proton collisions at the LHC with ALICE
by Andrei, C.; Colamaria, F.; Marchisone, M.
in Astronomy, Astrophysics and Cosmology, Beams (radiation)
2013
Measurements of cross sections of inelastic and diffractive processes in proton–proton collisions at LHC energies were carried out with the ALICE detector. The fractions of diffractive processes in inelastic collisions were determined from a study of gaps in charged-particle pseudorapidity distributions: for single diffraction (diffractive mass M_X < 200 GeV/c²), σ_SD/σ_INEL = …, …, and …, respectively, at the three centre-of-mass energies studied; for double diffraction (pseudorapidity gap Δη > 3), σ_DD/σ_INEL = 0.11 ± 0.03, 0.12 ± 0.05, and …, respectively. To measure the inelastic cross section, beam properties were determined with van der Meer scans and, using a simulation of diffraction adjusted to data, the following values were obtained: σ_INEL = … mb at √s = … and … mb at √s = …. The single- and double-diffractive cross sections were calculated by combining the relative rates of diffraction with the inelastic cross sections. The results are compared to previous measurements at proton–antiproton and proton–proton colliders at lower energies, to measurements by other experiments at the LHC, and to theoretical models.
Journal Article
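The abstract states that the diffractive cross sections were calculated by combining the relative rates of diffraction with the inelastic cross section, i.e. σ_DD = (σ_DD/σ_INEL) × σ_INEL. Below is a worked sketch of that arithmetic with naive uncorrelated error propagation, using the double-diffraction ratio quoted in the abstract and an assumed, illustrative inelastic cross section.

```python
# Sketch of the combination step: sigma_DD = (sigma_DD / sigma_INEL) * sigma_INEL.
# The ratio comes from the abstract; the inelastic value is a placeholder,
# not the ALICE measurement.
from math import hypot

ratio_dd, d_ratio = 0.11, 0.03    # sigma_DD / sigma_INEL (from the abstract)
sigma_inel, d_inel = 70.0, 5.0    # mb, illustrative placeholder values

sigma_dd = ratio_dd * sigma_inel
# Relative errors add in quadrature for a product of independent quantities.
d_sigma_dd = sigma_dd * hypot(d_ratio / ratio_dd, d_inel / sigma_inel)
print(f"sigma_DD = {sigma_dd:.1f} +/- {d_sigma_dd:.1f} mb")
```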
Energy dependence of forward-rapidity J/ψ and ψ(2S) production in pp collisions at the LHC
by
Andrei, C.
,
Alexandre, D.
,
Albuquerque, D. S. D.
in
Astronomy
,
Astrophysics and Cosmology
,
Collisions
2017
We present results on transverse momentum (p_T) and rapidity (y) differential production cross sections, mean transverse momentum and mean transverse momentum square of inclusive J/ψ and ψ(2S) at forward rapidity (2.5 < y < 4), as well as ψ(2S)-to-J/ψ cross section ratios. These quantities are measured in pp collisions at center-of-mass energies √s = 5.02 and 13 TeV with the ALICE detector. Both charmonium states are reconstructed in the dimuon decay channel, using the muon spectrometer. A comprehensive comparison to inclusive charmonium cross sections measured at √s = 2.76, 7 and 8 TeV is performed. A comparison to non-relativistic quantum chromodynamics and fixed-order next-to-leading logarithm calculations, which describe prompt and non-prompt charmonium production respectively, is also presented. A good description of the data is obtained over the full p_T range, provided that both contributions are summed. In particular, it is found that for p_T > 15 GeV/c the non-prompt contribution reaches up to 50% of the total charmonium yield.
Journal Article
Convergent data-driven workflows for open radiation calculations: an exportable methodology to any field
by
Suárez-Durán, Mauricio
,
Rubio-Montero, Antonio Juan
,
Carretero, Manuel
in
Astrophysics
,
Atmosphere
,
Automation
2025
The fast worldwide growth of linkable scientific datasets poses significant challenges for their management and reuse. Large experiments, such as the Latin American Giant Observatory, generate volumes of data that can benefit other kinds of studies. In this sense, there is a modular ecosystem of external radiation tools that should harvest and supply datasets without being part of the main pipeline. Workflows for personal dose estimation, muography in volcanology or mining, or aircraft dose calculations are built with different privacy policies and exploitation licenses. Every numerical method has its own requirements, and only parts of each workflow can make use of the Collaboration’s resources, which implies convergence with other computing infrastructures. Our work focuses on developing an agnostic methodology to address these challenges while promoting open science. Leveraging the encapsulation of software in nested containers, where the inner layers accomplish specific standardization slices and calculations, the wrapper compiles the metadata and data generated and publishes them. All this allows researchers to build a data-driven computing continuum that complies with the findable, accessible, interoperable, and reusable (FAIR) principles. The approach has been successfully tested in the compute-demanding field of radiation–matter interaction with humans, showing the orchestration with the regular pipeline for diverse applications. Moreover, it has been integrated into public and federated cloud environments as well as into local clusters and personal computers to ensure the portability and scalability of the simulations. We postulate that this successful use case can be customized to any other field.
Journal Article
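The abstract above centres on nested containers: inner layers perform the standardised calculation while an outer wrapper compiles and publishes run metadata. Below is a minimal sketch of that wrapper pattern, assuming Docker as the container runtime; the image name, mount path, and metadata fields are hypothetical, not the paper's actual implementation.

```python
# Hedged sketch of the nested-container pattern: an outer wrapper launches
# the inner (standardised calculation) container, then records provenance
# metadata next to the outputs for later publication. Image name, paths,
# and metadata fields are illustrative assumptions.
import datetime
import json
import pathlib
import subprocess

workdir = pathlib.Path("run-001")
workdir.mkdir(exist_ok=True)

# Inner layer: the actual radiation calculation runs in its own image.
result = subprocess.run(
    ["docker", "run", "--rm", "-v", f"{workdir.resolve()}:/data",
     "example/radiation-calc:latest"],        # hypothetical image
    capture_output=True, text=True,
)

# Outer layer: the wrapper compiles metadata for FAIR publication.
metadata = {
    "image": "example/radiation-calc:latest",
    "finished": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "exit_code": result.returncode,
}
(workdir / "metadata.json").write_text(json.dumps(metadata, indent=2))
```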
Response of HPC hardware to neutron radiation at the dawn of exascale
by
Bustos, Andrés
,
Mayo-García, Rafael
,
Campo, Xandra
in
Algorithms
,
Californium isotopes
,
Compilers
2023
Every computation presents a small chance that an unexpected phenomenon ruins or modifies its output. Computers are prone to errors that, although they may be very unlikely, are hard, expensive or simply impossible to avoid. At the exascale, with thousands of processors involved in a single computation, those errors are especially harmful because they can corrupt or distort the results, wasting human and material resources. In the present work, we study the effect of ionizing radiation on several pieces of commercial hardware that are very common in modern supercomputers. Aiming to reproduce the natural radiation that could arise, CPUs (Xeon, EPYC) and GPUs (A100, V100, T4) are subjected to a known flux of neutrons coming from two radioactive sources, namely ²⁵²Cf and ²⁴¹Am-Be, in a special irradiation facility. The working hardware is irradiated under supervision to quantify any appearing error. Once the hardware response is characterised, we are able to scale down the radiation intensity and estimate the effects on standard data centres. This can help administrators and researchers to develop their contingency plans and protocols.
Journal Article
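The abstract's final step, scaling the measured hardware response from the irradiation facility down to natural conditions, is simple arithmetic: derive a per-device error cross section from the accelerated test, then multiply by the natural neutron flux. Below is a sketch with illustrative numbers; the error count, fluence, and the roughly 13 n/(cm² h) sea-level flux are assumptions, not values from the paper.

```python
# Sketch of scaling an accelerated-test error rate to natural conditions.
# All numbers are illustrative assumptions, not measurements from the paper.
errors_observed = 12               # errors seen during irradiation
fluence = 3.0e9                    # accelerated test fluence, n/cm^2
sigma = errors_observed / fluence  # per-device error cross section, cm^2

natural_flux = 13.0 / 3600.0       # assumed ~13 n/(cm^2 h) at sea level, in n/(cm^2 s)
rate = sigma * natural_flux        # expected errors per device per second

hours_per_error = 1.0 / (rate * 3600.0)
print(f"sigma = {sigma:.2e} cm^2, ~1 error per {hours_per_error:.0f} device-hours")
```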
Multiplicity and transverse momentum evolution of charge-dependent correlations in pp, p–Pb, and Pb–Pb collisions at the LHC
by
Andrei, C.
,
Alexandre, D.
,
Colamaria, F.
in
Astronomy
,
Astrophysics and Cosmology
,
Elementary Particles
2016
We report on two-particle charge-dependent correlations in pp, p–Pb, and Pb–Pb collisions as a function of the pseudorapidity and azimuthal angle difference, Δη and Δφ respectively. These correlations are studied using the balance function that probes the charge creation time and the development of collectivity in the produced system. The dependence of the balance function on the event multiplicity as well as on the trigger and associated particle transverse momentum (p_T) in pp, p–Pb, and Pb–Pb collisions at √s_NN = 7, 5.02, and 2.76 TeV, respectively, is presented. In the low transverse momentum region, for 0.2 < p_T < 2.0 GeV/c, the balance function becomes narrower in both Δη and Δφ directions in all three systems for events with higher multiplicity. The experimental findings favor models that either incorporate some collective behavior (e.g. AMPT) or different mechanisms that lead to effects that resemble collective behavior (e.g. PYTHIA8 with color reconnection). For higher values of transverse momenta the balance function becomes even narrower but exhibits no multiplicity dependence, indicating that the observed narrowing with increasing multiplicity at low p_T is a feature of bulk particle production.
Journal Article
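The balance function mentioned in the abstract is commonly defined as B = ½[(N⁺⁻ − N⁺⁺)/N⁺ + (N⁻⁺ − N⁻⁻)/N⁻], with pair counts binned in Δη (or Δφ). Below is a toy sketch of that computation under this assumed definition; the event content is synthetic, and the normalisation and correction details of the ALICE analysis are not reproduced.

```python
# Toy sketch of a balance function in |delta eta|, under the common
# definition B = 0.5 * [(N+- - N++)/N+ + (N-+ - N--)/N-]. Synthetic event.
import numpy as np

rng = np.random.default_rng(0)
eta = rng.uniform(-1, 1, size=400)      # toy pseudorapidities
charge = rng.choice([-1, 1], size=400)  # toy charges

bins = np.linspace(0, 2, 21)

def pair_hist(c1, c2):
    """Histogram of |delta eta| over all pairs with charges (c1, c2)."""
    i, j = np.where((charge[:, None] == c1) & (charge[None, :] == c2))
    mask = i != j  # exclude self-pairs in the same-charge terms
    return np.histogram(np.abs(eta[i[mask]] - eta[j[mask]]), bins=bins)[0]

n_plus, n_minus = np.sum(charge == 1), np.sum(charge == -1)
B = 0.5 * ((pair_hist(1, -1) - pair_hist(1, 1)) / n_plus
           + (pair_hist(-1, 1) - pair_hist(-1, -1)) / n_minus)
print(B)  # one value per |delta eta| bin
```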