Catalogue Search | MBRL
12 result(s) for "Danovaro, Emanuele"
Polytope: an algorithm for efficient feature extraction on hypercubes
2025
Data extraction algorithms on data hypercubes, or datacubes, are traditionally only capable of cutting boxes of data along the datacube axes. For many use cases, however, this returns much more data than users actually need, leading to unnecessary consumption of I/O resources. In this paper, we propose an alternative feature extraction technique, which carefully computes the indices of data points contained within user-requested shapes. This enables data storage systems to read and return only the bytes useful to user applications from the datacube. Our main algorithm is based on high-dimensional computational geometry concepts and operates by successively reducing polytopes down to the points contained within them. We analyse this algorithm in detail before providing results on its performance and scalability. In particular, we show it is possible to achieve data reductions of up to 99% using this algorithm instead of current state-of-practice data extraction methods, such as meteorological field extractions from ECMWF's FDB data store, where feature shapes are extracted a posteriori as a post-processing step. As we discuss later on, this novel extraction method will considerably help scale access to large petabyte-size data hypercubes in a variety of scientific fields.
Journal Article
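The core idea the abstract describes is selecting datacube indices inside an arbitrary requested shape rather than an axis-aligned box. Below is a minimal sketch of that idea, not the authors' implementation; the grid, the triangular shape, and the function names are illustrative assumptions.

```python
# Sketch: keep only the grid indices inside a user-requested convex shape,
# instead of reading the whole axis-aligned bounding box from a datacube.
import numpy as np

def inside_convex_polygon(points, vertices):
    """Vectorised half-plane test: True where a point lies inside the
    convex polygon given by `vertices` (counter-clockwise order)."""
    inside = np.ones(len(points), dtype=bool)
    n = len(vertices)
    for i in range(n):
        a, b = vertices[i], vertices[(i + 1) % n]
        edge = b - a
        to_pt = points - a
        cross = edge[0] * to_pt[:, 1] - edge[1] * to_pt[:, 0]
        inside &= cross >= 0
    return inside

# A 2D slice of a datacube: axes could be latitude x longitude indices.
lat = np.arange(100)
lon = np.arange(100)
grid = np.array(np.meshgrid(lat, lon, indexing="ij")).reshape(2, -1).T

# A user-requested triangular feature (e.g. a corridor of interest).
triangle = np.array([[10.0, 10.0], [80.0, 20.0], [40.0, 90.0]])

mask = inside_convex_polygon(grid.astype(float), triangle)
selected = grid[mask]
print(f"box extraction would read {grid.shape[0]} points, "
      f"polytope extraction reads {selected.shape[0]} "
      f"({100 * (1 - selected.shape[0] / grid.shape[0]):.0f}% reduction)")
```

In a real storage system the surviving indices would then be translated into byte ranges, so that only those bytes are ever read from the datacube.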
Towards standard metadata to support models and interfaces in a hydro-meteorological model chain
2015
This paper seeks to move towards an un-encoded metadata standard supporting the description of environmental numerical models and their interfaces with other such models. Building on formal metadata standards and supported by the local standards applied by modelling frameworks, the aim is to produce a solution which is as simple as possible yet meets the requirements of model coupling processes. The purpose of this metadata is to allow environmental numerical models, with a first application to a hydro-meteorological model chain, to be discovered and then initially evaluated for their suitability for use, in particular in integrated model compositions. The method applied is to begin with the ISO19115 standard and add extensions suitable for environmental numerical models in general. Further extensions are considered pertaining to model interface parameters (or phenomena), together with spatial and temporal characteristics supported by feature types from the Climate Science Modelling Language. Successful validation of parameters depends heavily on the existence of controlled vocabularies. The metadata structure formulated has been designed to strike the right balance between simplicity and supporting the purposes drawn out by interfacing the Real-time Interactive Basin Simulator hydrological model to meteorological and hydraulic models and, as such, successfully provides an initial level of information to the user.
Journal Article
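As a rough illustration of the kind of extension the abstract describes (ISO19115-style discovery fields plus interface parameters validated against controlled vocabularies), here is a toy record. All field names and vocabulary entries are assumptions for illustration, not the paper's actual schema.

```python
# Toy, ISO19115-inspired metadata record for a numerical model interface,
# with a controlled-vocabulary check as the abstract describes.
from dataclasses import dataclass, field

# A stand-in controlled vocabulary of interface phenomena (hypothetical).
PHENOMENA_VOCABULARY = {"rainfall_rate", "river_discharge", "soil_moisture"}

@dataclass
class InterfaceParameter:
    phenomenon: str           # must come from the controlled vocabulary
    units: str
    spatial_feature: str      # e.g. a CSML-style feature type such as "GridSeries"
    temporal_resolution: str  # e.g. "PT1H" (ISO 8601 duration)

    def validate(self):
        if self.phenomenon not in PHENOMENA_VOCABULARY:
            raise ValueError(f"unknown phenomenon: {self.phenomenon!r}")

@dataclass
class ModelMetadata:
    title: str                # core ISO19115-style discovery fields...
    abstract: str
    inputs: list = field(default_factory=list)   # ...plus interface extensions
    outputs: list = field(default_factory=list)

rain = InterfaceParameter("rainfall_rate", "mm/h", "GridSeries", "PT1H")
rain.validate()  # passes only because the vocabulary defines the term
```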
The Destination Earth digital twin for climate change adaptation
by Müller, Sebastian; Nurisso, Matteo; Quintino, Tiago
in Adaptation; Artificial intelligence; Automation
2026
The Climate Change Adaptation Digital Twin (Climate DT), developed as part of the European Commission's Destination Earth (DestinE) initiative, sets up an operational system for producing multi-decadal, multi-model global climate projections and translating climate data into climate impact information to support adaptation efforts. This system delivers data with local granularity at spatial resolutions of 5–10 km and hourly outputs, leading to globally consistent information at scales that matter for decision-making. It also enables the testing of what-if scenarios such as high-resolution storylines, which are physically consistent global simulations of extreme events under different climate conditions and provide contextual insights to support concrete adaptation decisions. These storylines also support the generation of more equitable (understood as accessible and relevant across regions) climate information. The Climate DT is built on cutting-edge infrastructure, expert collaboration, and digital innovation. It is designed to support on-demand responses to policy questions, with quantified uncertainty. It will foster interactivity by allowing users to influence simulation design, model output portfolios, and application integration through co-design. AI-based tools, including emulators and chatbots, are being developed in parallel to enhance access to climate information. Sector-specific applications are embedded in the system to synchronously translate climate data into tailored climate-impact indicators, with examples provided for energy, water, and forest management. The applications have been co-designed with informed users. A unified, cross-platform workflow defines the orchestration of all components, which is handled by a single workflow manager and relies on containerised components, facilitating automation, portability, maintainability, and traceability. Data management is unified using standard grids (HEALPix), ensuring consistency and easing data usability under a strict governance policy. Streaming enables real-time data use by data consumers and unlocks access to the unprecedented data wealth produced by the high-resolution simulations. Monitoring tools provide real-time quality control of data and model outputs and enable continuous assessment of the realism of the climate simulations during Climate DT operation. The compute-intensive system is powered by world-class supercomputing capabilities through a strategic partnership with the European High Performance Computing Joint Undertaking (EuroHPC). Despite high computational demands, the Climate DT sets a new benchmark for delivering equitable, credible, and actionable climate information. It complements existing initiatives like CMIP, CORDEX, and national and European climate services, and aligns with global climate science goals to support climate adaptation.
Journal Article
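One concrete technical detail in this abstract is the use of a standard HEALPix grid to unify data management: every producer and consumer maps coordinates to the same equal-area pixel indexing, so datasets align pixel-for-pixel without regridding. A small sketch using the healpy library; the nside value and coordinates are illustrative assumptions, not the Climate DT's actual configuration.

```python
# Sketch: deterministic mapping from (lon, lat) to a HEALPix pixel index.
import healpy as hp
import numpy as np

nside = 512  # illustrative resolution; HEALPix pixels are equal-area
print(f"pixels: {hp.nside2npix(nside)}, "
      f"mean spacing ~{np.degrees(hp.nside2resol(nside)):.3f} deg")

# Any coordinate maps to exactly one pixel, for every dataset on the grid.
lon, lat = 55.3, 25.3  # e.g. Dubai
pix = hp.ang2pix(nside, lon, lat, lonlat=True, nest=True)
print(f"(lon={lon}, lat={lat}) -> HEALPix pixel {pix}")
```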
DRIHM(2US)
by Clematis, Andrea; Galizia, Antonella; Siccardi, Franco
in Applied mathematics; Atmospheric models; Climate change
2017
From 1970 to 2012, about 9,000 high-impact weather events were reported globally, causing the loss of 1.94 million lives and damage of $2.4 trillion (U.S. dollars). The scientific community is called to action to improve the prediction of such events and to communicate forecasts and associated risks both to affected populations and to those making decisions. At the heart of this challenge lies the ability to have easy access to hydrometeorological data and models, and to facilitate the necessary collaboration between meteorologists, hydrologists, and computer science experts to achieve accelerated scientific advances. Two European Union (EU)-funded projects, Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) and DRIHM to United States of America (DRIHM2US), sought to help address this challenge by developing a prototype e-science environment providing advanced end-to-end services (models, datasets, and postprocessing tools), with the aim of paving the way to a step change in how scientists can approach studying these events, with a special focus on flood events in complex topographic areas. This paper describes the motivation and philosophy behind this prototype e-science environment together with certain key components, focusing on hydrometeorological aspects that are then illustrated through actionable research for a critical flash flood event that occurred in October 2014 in Liguria, Italy.
Journal Article
From Lesson Learned to the Refactoring of the DRIHM Science Gateway for Hydro-meteorological Research
by Clematis, Andrea; Galizia, Antonella; Roverelli, Luca
in Computer programs; Computer Science; Computer simulation
2016
A full hydro-meteorological (HM) simulation, from rainfall to impact on urban areas, is a multidisciplinary activity which consists of the execution of a workflow composed of complex and heterogeneous model engines. Moreover, an extensive set of configuration parameters has to be selected consistently across the models; otherwise the simulation can fail or produce unreliable results. The DRIHM portal is a Web-based science gateway aiming to support HM researchers in designing, executing and managing HM simulations. The first version of the portal was developed during the DRIHM project using the gUSE science gateway toolkit. The lessons we learned are guiding a refactoring process that, together with a review of the most relevant technologies for the development of a science gateway, represents the focus of this paper. Besides the technological aspects, the need for a strong interplay between ICT and other domain-specific communities clearly emerged, together with the need for coherent policies in the management of the data, computational resources and software components that make up the ecosystem of a science gateway.
Journal Article
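The abstract's point that configuration parameters "have to be selected consistently across the models" can be made concrete with a toy cross-model consistency check, run before launching the chain. The model names, phenomena, and units below are illustrative assumptions, not the DRIHM portal's actual mechanism.

```python
# Sketch: verify each model's outputs match the next model's expected
# inputs (phenomenon present, units agreeing) before running the workflow.
CHAIN = [
    {"name": "meteo model", "outputs": {"rainfall_rate": "mm/h"}},
    {"name": "RIBS", "inputs": {"rainfall_rate": "mm/h"},
     "outputs": {"river_discharge": "m3/s"}},
    {"name": "hydraulic model", "inputs": {"river_discharge": "m3/s"}},
]

def check_chain(chain):
    for up, down in zip(chain, chain[1:]):
        for phenomenon, units in down.get("inputs", {}).items():
            provided = up.get("outputs", {}).get(phenomenon)
            if provided is None:
                raise ValueError(f"{down['name']} needs {phenomenon}, "
                                 f"but {up['name']} does not produce it")
            if provided != units:
                raise ValueError(f"unit mismatch for {phenomenon}: "
                                 f"{provided} vs {units}")
    print("model chain is consistent")

check_chain(CHAIN)
```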
Polytope: An Algorithm for Efficient Feature Extraction on Hypercubes
by Leuridan, Mathilde; Quintino, Tiago; Smart, Simon
in Algorithms; Computational geometry; Feature extraction
2023
Data extraction algorithms on data hypercubes, or datacubes, are traditionally only capable of cutting boxes of data along the datacube axes. For many use cases, however, this approach is not sufficient and returns more data than users actually need. This not only forces users to apply post-processing after extraction, but, more importantly, consumes more I/O resources than necessary. When considering very large datacubes from which users only want to extract small non-rectangular subsets, the box approach does not scale well. Indeed, with this traditional approach, I/O systems quickly reach capacity trying to read and return unwanted data to users. In this paper, we propose a novel technique, based on computational geometry concepts, which instead carefully pre-selects the precise bytes of data the user needs, so that only those are read from the datacube. As we discuss later on, this novel extraction method will considerably help scale access to large petabyte-size data hypercubes in a variety of scientific fields.
Exploring DAOS Interfaces and Performance
by Smart, Simon D; Quintino, Tiago; Jackson, Adrian
in Application programming interface; Input output analysis; Memory devices
2024
Distributed Asynchronous Object Store (DAOS) is a novel software-defined object store leveraging Non-Volatile Memory (NVM) devices, designed for high performance. It provides a number of interfaces for applications to undertake I/O, ranging from a native object storage API to a DAOS FUSE module for seamless compatibility with existing applications using POSIX file system APIs. In this paper we discuss these interfaces and the options they provide, exercise DAOS through them with various I/O benchmarks, and analyse the observed performance. We also briefly compare the performance with a distributed file system and another object storage system deployed on the same hardware, and showcase DAOS' potential and increased flexibility to support high-performance I/O.
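To make the benchmarking approach concrete, here is a minimal, illustrative timing sketch in the spirit of exercising the POSIX-compatible interface (a DAOS FUSE mount). The mount path, file size, and block size are assumptions, and real benchmarks such as IOR control far more variables.

```python
# Sketch: time sequential writes through a POSIX path, e.g. a dfuse mount.
import os
import time

MOUNT = "/mnt/dfuse"     # hypothetical DAOS FUSE mount point
PATH = os.path.join(MOUNT, "bench.dat")
BLOCK = 1 << 20          # 1 MiB blocks
COUNT = 256              # 256 MiB total

buf = os.urandom(BLOCK)
t0 = time.perf_counter()
with open(PATH, "wb") as f:
    for _ in range(COUNT):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())  # ensure data reaches the store, not the page cache
t1 = time.perf_counter()
print(f"write: {BLOCK * COUNT / (t1 - t0) / 1e9:.2f} GB/s")
```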
Reducing the Impact of I/O Contention in Numerical Weather Prediction Workflows at Scale Using DAOS
by Smart, Simon D; Quintino, Tiago; Jackson, Adrian
in Data storage; Distributed memory; Machine learning
2024
Operational Numerical Weather Prediction (NWP) workflows are highly data-intensive. Data volumes have increased by many orders of magnitude over the last 40 years and are expected to continue to do so, especially given the upcoming adoption of Machine Learning in forecast processes. Parallel POSIX-compliant file systems have been the dominant paradigm for data storage and exchange in HPC workflows for many years. This paper presents ECMWF's move beyond the POSIX paradigm, implementing a backend for their storage library to support DAOS, a novel high-performance object store designed for massively distributed Non-Volatile Memory. This system is demonstrated to outperform the highly mature and optimised POSIX backend when used under high load and contention, as is typical of forecast workflow I/O patterns. This work constitutes a significant step forward, beyond the performance constraints imposed by POSIX semantics.
DAOS as HPC Storage, a view from Numerical Weather Prediction
by Smart, Simon D; Quintino, Tiago; Jackson, Adrian
in Benchmarks; Configuration management; Numerical prediction
2023
Object storage solutions potentially address long-standing performance issues with POSIX file systems for certain I/O workloads, and new storage technologies offer promising performance characteristics for data-intensive use cases. In this work, we present a preliminary assessment of Intel's Distributed Asynchronous Object Store (DAOS), an emerging high-performance object store, in conjunction with non-volatile storage and evaluate its potential use for HPC storage. We demonstrate DAOS can provide the required performance, with bandwidth scaling linearly with additional DAOS server nodes in most cases, although choices in configuration and application design can impact achievable bandwidth. We describe a new I/O benchmark and associated metrics that address object storage performance from application-derived workloads.
OpenCUBE: Building an Open Source Cloud Blueprint with EPI Systems
by Schulz, Martin; Marcuello, Pedro; Wahlgren, Jacob
in Cloud computing; Molecular docking; Open source software
2024
OpenCUBE aims to develop an open-source full software stack for Cloud computing blueprint deployed on EPI hardware, adaptable to emerging workloads across the computing continuum. OpenCUBE prioritizes energy awareness and utilizes open APIs, Open Source components, advanced SiPearl Rhea processors, and RISC-V accelerator. The project leverages representative workloads, such as cloud-native workloads and workflows of weather forecast data management, molecular docking, and space weather, for evaluation and validation.