Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
6 result(s) for "Pardo-Diaz, Alfonso"
CardioGRID: a framework for the analysis of cardiological signals in GRID computing
by Pollan, Raul Ramos; Garcia Eijo, Juan Francisco; Risk, Marcelo
in Analytics, Computational grids, Data analysis
2011
This paper describes the integration of the CardioGRID framework into the GRID infrastructure. The core GRID services, Workload Management System (WMS), Data Management System and Grid Authentication, have been implemented. Additionally, a web-based tool, the CardioGRID portal, has been developed to facilitate user interaction with the GRID. As a result, the user is able to upload electrocardiogram (ECG) signals obtained from a portable data acquisition device and process them on the GRID. Once the CardioGRID portal is opened and the user's identity is verified through a digital X.509 certificate, the operator may either upload new raw ECG data to the GRID Storage Elements or use already stored data. Subsequent analytics on these data are then performed as GRID jobs, and relevant medical quantities are retrieved through the middleware job retrieval mechanism. In summary, this paper describes the development of a medical GRID-based system and its integration into an existing platform for Digital Repositories Infrastructure.
Journal Article
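The submit-and-retrieve workflow the abstract describes (ECG data uploaded, analysed as GRID jobs, results fetched back) can be sketched with a local thread pool standing in for the WMS; the function names and the derived quantities here are illustrative assumptions, not the CardioGRID API:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse_ecg(samples):
    """Toy stand-in for one GRID job: derive simple quantities from raw ECG samples."""
    return {
        "mean_mv": sum(samples) / len(samples),
        "peak_mv": max(samples),
        "n_samples": len(samples),
    }

def run_jobs(records):
    """Submit one job per uploaded ECG record and collect results on completion,
    mimicking a WMS-style submit/retrieve cycle (local threads, not a real grid)."""
    with ThreadPoolExecutor() as pool:
        futures = {rid: pool.submit(analyse_ecg, data) for rid, data in records.items()}
        return {rid: f.result() for rid, f in futures.items()}
```

In the real system each submitted job would run on a remote Computing Element and the raw data would live on Storage Elements; the pool here only models the asynchronous submit/retrieve pattern.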
Hyperspectral Image Analysis Using Cloud-Based Support Vector Machines
by Haut, Juan M.; Franco-Valiente, Jose M.; Paoletti, Mercedes E.
in Algorithms, Classification, Computer Imaging
2024
Hyperspectral image processing techniques involve time-consuming calculations due to the large volume and complexity of the data. Indeed, hyperspectral scenes contain a wealth of spatial and spectral information thanks to the hundreds of narrow and continuous bands collected across the electromagnetic spectrum. Predictive models, particularly supervised machine learning classifiers, take advantage of this information to predict the pixel categories of images through a training set of real observations. Most notably, the Support Vector Machine (SVM) has demonstrated impressive accuracy results for image classification. Notwithstanding the performance offered by SVMs, dealing with such a large volume of data is computationally challenging. In this paper, a scalable and high-performance cloud-based approach for distributed training of SVMs is proposed. The proposal addresses the overwhelming volume of remote sensing (RS) data through a parallel training allocation. The implementation is performed over a memory-efficient Apache Spark distributed environment. Experiments are performed on a benchmark of real hyperspectral scenes to show the robustness of the proposal. The results obtained demonstrate efficient classification whilst optimising data processing in terms of training times.
Journal Article
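One common way to allocate SVM training across workers, used here purely as an illustration of the parallel-allocation idea (the paper's Spark implementation will differ), is to train a linear SVM on each data partition and average the resulting parameters:

```python
def train_linear_svm(X, y, lr=0.1, lam=0.01, epochs=50):
    """Subgradient descent on the hinge loss for a linear SVM (labels in {-1, +1})."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # point inside the margin: hinge subgradient step
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only the regulariser contributes
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def distributed_train(partitions):
    """Train one SVM per data partition and average the parameters,
    a simple stand-in for allocating training across cloud workers."""
    models = [train_linear_svm(X, y) for X, y in partitions]
    n = len(models)
    w_avg = [sum(m[0][j] for m in models) / n for j in range(len(models[0][0]))]
    b_avg = sum(m[1] for m in models) / n
    return w_avg, b_avg

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

On Spark, each partition's model would be fitted by an executor over its slice of pixels, with only the small parameter vectors shuffled back to the driver, which is what makes the approach memory-efficient for large scenes.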
Direct nanopore sequencing of M. tuberculosis on sputa and rescue of suboptimal results to enhance transmission surveillance
2025
Whole-genome sequencing (WGS) enhances precision in predicting antimicrobial resistance and tracking Mycobacterium tuberculosis (MTB) transmission. Due to MTB's slow-growing nature, genomic results are delayed; however, few efforts have sought to accelerate them by performing WGS directly on respiratory specimens, and most culture-free efforts have focused on accelerating resistance prediction. The present study provides further evidence to the only preceding study aiming to accelerate precise delineation of transmission, coupling culture-free WGS to a surveillance programme. Our study is distinguished from its predecessor by being the first to apply flexible nanopore sequencing to further accelerate the process. A total of 71 sputa were selected, to which we applied only a procedure to deplete human DNA, thus avoiding costly and cumbersome capture-bait alternatives. Optimal results (>90% genome covered, mean coverage >45× and >70% genome covered >20×) were obtained in 33.8% of cases, allowing assignment of every new case to transmission clusters close to the time of diagnosis. A further 12.6% of samples yielded suboptimal results (15.5%-90.92% at >10×), which were exploited through a rescue pipeline. This approach was based on identifying informative SNPs acting as markers for relevant transmission clusters in our population. The pipeline enabled pre-allocation of new cases to pre-existing clusters and, in some cases, precise genomic relationships with the preceding cases in the cluster. In summary, this study demonstrates that epidemiologically valuable information can be obtained directly from sputum in approximately half the samples analysed. It represents a further advance in the pursuit of faster comparative genomics, with epidemiological purposes, at diagnosis.
Journal Article
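The marker-SNP rescue idea, pre-allocating a low-coverage sample to a known transmission cluster when it carries that cluster's informative SNPs, can be sketched as a simple lookup; the threshold and data shapes below are hypothetical, not the study's pipeline:

```python
def preallocate(sample_snps, cluster_markers, min_fraction=0.8):
    """Assign a new sample to the pre-existing transmission cluster whose
    informative marker SNPs it carries.

    sample_snps: set of SNPs called in the (possibly suboptimal) genome.
    cluster_markers: dict mapping cluster name -> set of marker SNPs.
    min_fraction: hypothetical minimum fraction of markers that must be
                  observed before a cluster assignment is accepted.
    Returns the best-matching cluster name, or None if no cluster qualifies.
    """
    best, best_frac = None, 0.0
    for cluster, markers in cluster_markers.items():
        frac = len(sample_snps & markers) / len(markers)
        if frac > best_frac:
            best, best_frac = cluster, frac
    return best if best_frac >= min_fraction else None
```

The point of the rescue is that a partial genome need not cover every position: observing enough cluster-specific markers already narrows the new case to one pre-existing cluster.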
A Novel Cloud-Based Framework for Standardised Simulations in the Latin American Giant Observatory (LAGO)
by Sidelnik, Iván; Pardo-Diaz, Alfonso; Mayo-García, Rafael
in Astrophysics, Cerenkov counters, Cosmic rays
2022
LAGO, the Latin American Giant Observatory, is an extended cosmic ray observatory consisting of a wide network of water Cherenkov detectors located in 10 countries. With different altitudes and geomagnetic rigidity cutoffs, their geographic distribution, combined with the new electronics for control, atmospheric sensing and data acquisition, allows the realisation of diverse astrophysics studies at a regional scale. It is an observatory designed, built and operated by the LAGO Collaboration, a non-centralised alliance of 30 institutions from 11 countries. While LAGO has access to different computational frameworks, it lacks standardised computational mechanisms to fully exploit its cooperative approach. The European Commission is fostering initiatives aligned with LAGO's objectives, especially to enable Open Science and its long-term sustainability. This work introduces the adaptation of LAGO to this paradigm within the EOSC-Synergy project, focusing on the simulations of the expected astrophysical signatures at detectors deployed at the LAGO sites around the world.
The EOSC-Synergy cloud services implementation for the Latin American Giant Observatory (LAGO)
by Sidelnik, Iván; Pagán-Muñoz, Raúl; Rubio-Montero, Antonio Juan
in Atmospherics, Cerenkov counters, Cloud computing
2021
The Latin American Giant Observatory (LAGO) is a distributed cosmic ray observatory operating at a regional scale in Latin America, deploying a large network of Water Cherenkov detectors (WCD) and other astroparticle detectors across a wide range of latitudes, from Antarctica to México, and altitudes, from sea level to more than 5500 m a.s.l. Detector telemetry, atmospheric conditions and the flux of secondary particles at ground level are measured in great detail at each LAGO site using our own-designed hardware and firmware (ACQUA). To combine and analyse all these data, LAGO developed ANNA, our data analysis framework. Additionally, ARTI is a complete simulation framework designed to model the signals expected at our detectors from primary cosmic rays entering the Earth's atmosphere, allowing a precise characterisation of the sites under realistic atmospheric, geomagnetic and detector conditions. As the measured and synthetic data started to flow, we are facing challenging scenarios given the large amount of data emerging from a diversity of detectors, computing architectures and e-infrastructures. These data need to be transferred, analysed, catalogued, preserved and made available for internal and public access and data mining in an open e-science environment. In this work, we present the implementation of ARTI on the EOSC-Synergy cloud-based services as the first of LAGO's frameworks to follow the FAIR principles for provenance, data curation and re-use of data. To this end, we calculate the flux of secondary particles expected over up to one week at detector level for all 26 LAGO sites, and the one-year flux of high-energy secondaries expected at the ANDES Underground Laboratory and other sites. We thus show how this development can help not only LAGO but also other data-intensive cosmic ray observatories, muography experiments and underground laboratories.
A Ubiquitous Sensor Network Platform for Integrating Smart Devices into the Semantic Sensor Web
by Díaz Pardo de Vera, David; Sigüenza Izquierdo, Álvaro; Bernat Vercher, Jesús
in Automotive engineering, Consumption, Demand
2014
Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs.
Journal Article
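The core Semantic Sensor Web step the abstract relies on, turning a raw sensor reading into semantically interpreted, shareable information, can be illustrated with SSN/SOSA-style triples; the `ex:` identifiers and the flat string encoding are simplifications for illustration, not the platform's actual data model:

```python
def observation_to_triples(sensor_id, prop, value, unit, timestamp):
    """Encode one sensor reading as simplified SSN/SOSA-style RDF triples
    (subject, predicate, object), using abbreviated vocabulary names."""
    obs = f"ex:obs/{sensor_id}/{timestamp}"
    return [
        (obs, "rdf:type", "sosa:Observation"),
        (obs, "sosa:madeBySensor", f"ex:sensor/{sensor_id}"),
        (obs, "sosa:observedProperty", f"ex:prop/{prop}"),
        (obs, "sosa:hasSimpleResult", f"{value} {unit}"),
        (obs, "sosa:resultTime", timestamp),
    ]
```

Once published in this form, a smart device no longer needs to know each sensor's native format: it can query the shared graph for, say, all `sosa:Observation`s of a given property near its location, which is the homogeneous access mechanism the paper's USN-Platform targets.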