Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
66 result(s) for "Carrion, Daniela"
A Spatiotemporal Drought Analysis Application Implemented in the Google Earth Engine and Applied to Iran as a Case Study
2023
Drought is a major problem worldwide and has become more severe in recent decades, especially in arid and semi-arid regions. In this study, a Google Earth Engine (GEE) app has been implemented to monitor spatiotemporal drought conditions over different climatic regions. The app allows any user to perform an analysis over a region and period of their choice, benefiting from the huge GEE catalogue of free and open data as well as from its fast cloud-based computation. The app implements the scaled drought condition index (SDCI), a combination of three indices: the vegetation condition index (VCI), the temperature condition index (TCI), and the precipitation condition index (PCI), derived from satellite imagery through the Google Earth Engine platform. The De Martonne climate classification index has been used to derive the climate regions; within each region the indices have been computed separately. The test case covers Iran, a territory with high climate variability, where drought has been explored over a period of 11 years (2010 to 2021), providing a reasonable time series with the data available in the Google Earth Engine. The developed tool allowed singling out drought events within each climate region, offering both a spatial and a temporal representation of the phenomenon and confirming results found in local and global reports.
Journal Article
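The SDCI combination described in the abstract above can be sketched in a few lines. Below is a minimal numpy illustration, assuming the usual min-max definitions of VCI, TCI, and PCI and purely illustrative weights (the record does not state the weights the app uses):

```python
import numpy as np

def condition_index(x, invert=False):
    """Min-max scale a (time, y, x) stack to [0, 1] per pixel.

    With invert=True, high input values map to low index values
    (used for land surface temperature, where hot means dry).
    """
    x_min = np.nanmin(x, axis=0)
    x_max = np.nanmax(x, axis=0)
    idx = (x - x_min) / (x_max - x_min)
    return 1.0 - idx if invert else idx

# Hypothetical monthly stacks of NDVI, LST and precipitation,
# e.g. 11 years x 12 months over a 50 x 50 pixel region.
ndvi = np.random.rand(132, 50, 50)
lst = 280 + 40 * np.random.rand(132, 50, 50)
precip = 100 * np.random.rand(132, 50, 50)

vci = condition_index(ndvi)              # vegetation condition index
tci = condition_index(lst, invert=True)  # temperature condition index
pci = condition_index(precip)            # precipitation condition index

# Illustrative weights only; published SDCI variants weigh the
# three indices differently depending on the climate region.
sdci = 0.25 * vci + 0.25 * tci + 0.5 * pci
```

In the app itself the same arithmetic runs on GEE image collections rather than in-memory arrays; the array version only shows the index combination.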
Hazard Susceptibility Mapping with Machine and Deep Learning: A Literature Review
by Brovelli, Maria Antonia; Carrion, Daniela; Pugliese Viloria, Angelly de Jesus
in Air pollution; Algorithms; Artificial neural networks
2024
With the increase in climate-change-related hazardous events alongside population concentration in urban centres, it is important to provide resilient cities with tools for understanding and eventually preparing for such events. Machine learning (ML) and deep learning (DL) techniques have increasingly been employed to model the susceptibility of hazardous events. This study is a systematic review of the ML/DL techniques applied to model the susceptibility of air pollution, urban heat islands, floods, and landslides, with the aim of providing a comprehensive source of reference for both techniques and modelling approaches. A total of 1454 articles published between 2020 and 2023 were systematically selected from the Scopus and Web of Science search engines based on search queries and selection criteria. ML/DL techniques were extracted from the selected articles and categorised using an ad hoc classification. From these, a general approach for modelling the susceptibility of hazardous events was consolidated, covering data preprocessing, feature selection, modelling, model interpretation, and susceptibility map validation, along with examples of related global/continental data. The most frequently employed techniques across the various hazards are random forest, artificial neural networks, and support vector machines. The review also provides, per hazard, the definition, data requirements, and insights into the ML/DL techniques used, including examples of both state-of-the-art and novel modelling approaches.
Journal Article
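As an illustration of the modelling step the review consolidates, the short scikit-learn sketch below trains the most frequently reported technique, a random forest, on hypothetical conditioning factors and derives a susceptibility score per sample; all feature names and data are invented for the example:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical conditioning factors per pixel (e.g. slope, elevation,
# rainfall, distance to river); labels mark past hazard occurrence.
rng = np.random.default_rng(0)
X = rng.random((5000, 4))
y = (X[:, 0] + X[:, 2] + 0.2 * rng.random(5000) > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Susceptibility is the predicted probability of the hazard class;
# mapped back onto the pixel grid it yields a susceptibility map.
susceptibility = model.predict_proba(X_test)[:, 1]
print("test accuracy:", model.score(X_test, y_test))
```

The review's general pipeline (preprocessing, feature selection, interpretation, and map validation) would wrap around this core fitting step.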
VGI and Satellite Imagery Integration for Crisis Mapping of Flood Events
by Carrion, Daniela; Vavassori, Alberto; Zaragozi, Benito
in Airborne radar; Airborne remote sensing; Airborne sensing
2022
Timely mapping of flooded areas is critical to several emergency management tasks, including response and recovery activities. In fact, flood crisis maps embed key information for an effective response to the natural disaster by delineating its spatial extent and impact. Crisis mapping is usually carried out by leveraging data provided by satellite or airborne optical and radar sensors. However, processing these kinds of data demands experienced visual interpretation in order to achieve reliable results. Furthermore, the availability of in situ observations is crucial for the production and validation of crisis maps. In this context, a frontier challenge consists in the use of Volunteered Geographic Information (VGI) as a complementary in situ data source. This paper proposes a procedure for flood mapping that integrates VGI and optical satellite imagery while requiring limited user intervention. The procedure relies on the classification of multispectral images, exploiting VGI for the semi-automatic selection of training samples. The workflow has been tested with photographs and videos shared on social media (Twitter, Flickr, and YouTube) during two flood events, and classification consistency with reference products shows promising results (Overall Accuracy ranging from 87% to 93%). Considering the limitations of social-media-sourced photos, the use of QField is proposed as a dedicated application to collect the metadata needed for image classification. The results show that the integration of high-quality VGI data and semi-automatic data processing can be beneficial for crisis map production and validation, supporting crisis management with up-to-date maps.
Journal Article
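The core idea of the procedure, using VGI locations to select training samples semi-automatically for a multispectral classification, can be sketched as follows; the bands, sample coordinates, labels, and choice of classifier are assumptions for illustration, not the paper's exact workflow:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical multispectral scene: (bands, rows, cols).
image = np.random.rand(4, 200, 200)

# Hypothetical VGI-derived samples: pixel coordinates of geotagged
# photos labelled "flooded" (1) or "not flooded" (0).
vgi_rows = np.array([10, 40, 80, 120, 150, 30, 90, 170])
vgi_cols = np.array([15, 60, 20, 140, 100, 180, 50, 80])
vgi_label = np.array([1, 1, 0, 1, 0, 0, 1, 0])

# Semi-automatic training sample selection: spectra at VGI locations.
X_train = image[:, vgi_rows, vgi_cols].T
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, vgi_label)

# Classify every pixel to obtain the flood extent map.
flood_map = clf.predict(image.reshape(4, -1).T).reshape(200, 200)
```

In practice the VGI points would be buffered or filtered before sampling, and the resulting map compared against reference products, as the abstract reports.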
Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates
by Barzaghi, Riccardo; Carrion, Daniela; Pepe, Massimiliano
in Aerial surveys; Computation; Deflection
2016
Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the deflection of the vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one metre. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compare this approach with one based on local gravity data and collocation methods. In particular, the two mutually perpendicular components of the deflection of the vertical vector, denoted ξ and η (in the north and east directions, respectively), were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant for aerial surveys. The (ξ, η) values were then also estimated using the high-degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates shows that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations.
Journal Article
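For context, in the common spherical approximation the two deflection components are the slopes of the geoid undulation N (a textbook relation, not quoted from the paper itself):

```latex
\xi  = -\frac{1}{R}\,\frac{\partial N}{\partial \varphi}, \qquad
\eta = -\frac{1}{R\cos\varphi}\,\frac{\partial N}{\partial \lambda},
```

with R a mean Earth radius, φ the latitude, and λ the longitude; collocation-based and global-model estimates of (ξ, η) can therefore be compared directly, as done in the paper.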
Collocation and FFT-based geoid estimation within the Colorado 1 cm geoid experiment
2021
In the frame of the International Association of Geodesy Joint Working Group 2.2.2 "The 1 cm geoid experiment", terrestrial and airborne gravity datasets along with GPS/leveling data were made available for the comparison of different geoid modeling methods and techniques in the wider area of Colorado, USA. We discuss the methods and procedures we followed for computing gravimetric quasi-geoid and geoid models and geopotential values from the available datasets. The procedures were based on the remove-compute-restore approach using XGM2016 as the reference geopotential model. The higher frequencies of the gravity field were computed via the residual terrain correction, using the CGIAR-CSI SRTM digital elevation model with (a) the classical technique and (b) a spectral one. Least-Squares Collocation was used for the downward continuation of the airborne data and for gridding. Finally, the geoid models were obtained by applying Least-Squares Collocation and spherical FFT-based methods, while the influence of the orthometric height correction on geoid heights was taken into account by employing simple and complete Bouguer reductions. All results were evaluated against available GPS/leveling benchmarks. Moreover, potential values were determined in support of the International Height Reference System/Frame. From the results acquired, a final accuracy of 5-7 cm for the determined geoid models was achieved, depending on the adopted method and data combination, without considering the accuracy of the GPS/leveling data used for their evaluation. The contribution of the airborne gravity data was deemed limited in combined solutions, although the airborne-only solution provided a level of accuracy equal to that of the terrestrial and combined ones. Better consistency with the GPS/leveling data was obtained on the points of the GSVS17 line, where an accuracy of 2.4 cm and 2.8 cm was reached for the FFT- and LSC-based methods, respectively.
Journal Article
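In standard notation, the remove-compute-restore scheme used in the abstract above separates the observed gravity anomaly into a global-model part, a terrain part, and a residual, and rebuilds the geoid height the same way (a textbook summary, with XGM2016 as the reference model here):

```latex
\Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTC}},
\qquad
N = N_{\mathrm{GGM}} + N_{\mathrm{RTC}} + N_{\mathrm{res}},
```

where the residual geoid component N_res is estimated from the residual anomalies Δg_res by Least-Squares Collocation or the spherical FFT approach mentioned in the abstract.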
A 3D WebGIS Open-Source Prototype for Bridge Inspection Data Management
by Pinto, Livio; Fascia, Rebecca; Carrion, Daniela
in Access control; bridge inspection; Bridges
2025
In response to the increasing demand for effective bridge management and the shortcomings of current proprietary solutions, this work presents an open-source, web-based platform designed to support bridge inspection and data management, particularly for small and medium-sized public administrations, which often lack the personnel or funding to implement context-specific tools. The system addresses fragmented workflows by integrating multi-format geospatial and 3D data (such as point clouds, CAD/BIM models, and georeferenced imagery) within a unified, modular architecture. The platform enables structured inventory, interactive 2D/3D visualization, defect annotation, and role-based user interaction, aligning with FAIR principles and interoperability standards. Built entirely with free and open-source tools, the P.O.N.T.I. prototype ensures scalability, transparency, and adaptability. A multi-layer navigation interface guides users through asset exploration, inspection history, and immersive 3D viewers. Fully documented and publicly available on GitHub, the system allows for deployment across varying institutional contexts. The platform's design anticipates future developments, including integration with IoT monitoring systems, AI-driven inspection tools, and chatbot interfaces for natural language querying. By overcoming existing proprietary limitations and providing access to a versatile single space, the proposed solution supports decision-makers in the digital transition towards more accessible, transparent, and integrated infrastructure asset management.
Journal Article
Open access to regional geoid models: the International Service for the Geoid
by Barzaghi, Riccardo; Sansó, Fernando; Carrion, Daniela
in Access; Algorithms; Archives & records
2021
The International Service for the Geoid (ISG, https://www.isgeoid.polimi.it/, last access: 31 March 2021) provides free access to a dedicated and comprehensive repository of geoid models through its website. The archive freely provides both the latest releases of the most important and well-known geoid models and less recent or lesser-known ones, giving users a wide range of possible applications and allowing analyses of the evolution of the geoid computation research field. The ISG is an official service of the International Association of Geodesy (IAG), under the umbrella of the International Gravity Field Service (IGFS). Its main tasks are collecting, analysing, and redistributing local, regional, and continental geoid models and providing technical support to people involved in geoid-related topics for both educational and research purposes. In the framework of its activities, the ISG performs research taking advantage of its archive and organizes seminars and specific training courses on geoid determination, supporting students and researchers in geodesy as well as distributing training material on the use of the most common algorithms for geoid estimation. This paper aims to describe these data and services, including the newly implemented DOI Service for geoid models (https://dataservices.gfz-potsdam.de/portal/?fq=subject:isg, last access: 31 March 2021), and to show the added value of the ISG archive of geoid models for the scientific community and for technicians such as engineers and surveyors (https://www.isgeoid.polimi.it/Geoid/reg_list.html, last access: 31 March 2021).
Journal Article
Glacier responses to recent volcanic activity in Southern Chile
Glaciers in Southern Chile (39-43°S) are characterized by frontal retreats and area losses in response to the ongoing climatic changes at a timescale of decades. Superimposed on these longer-term trends, volcanic activity is thought to impact glaciers in variable ways. The debris- and ash-covered Glaciar Pichillancahue-Turbio has retreated only slightly in recent decades in spite of being located on Volcán Villarrica, which has experienced increased volcanic activity since 1977. In contrast, the negative long-term glacier area trend at Volcán Michinmahuida reversed shortly before the beginning of the explosive eruption of nearby Volcán Chaitén in May 2008, when Glaciar Amarillo advanced and a lahar-type mudflow was observed. This advance is analysed in connection with the nearby eruption, which produced albedo changes at the Michinmahuida glaciers, as well as possibly enhanced basal melting from a higher geothermal flux. Deconvolving the glacier responses to these processes is difficult and probably not possible with the available data. Much more work and data are required to determine the causes of present glacier behaviour.
Journal Article
A prototype HGIS for managing earthquake data from historical documents
2019
Studies of historical seismic events that occurred during the pre-instrumental era are mostly based on the interpretation of coeval records reporting earthquake effects on humans and buildings as experienced and reported by witnesses. Historical sources typically consist of written documents such as letters, newspaper articles, chronicles, and memoirs that survived the passage of time; from these documents and their historical context, seismologists isolate the relevant descriptive information on the effects of a seismic event in a place. This information is required to estimate a macroseismic intensity, in turn used as input to assess earthquake parameters such as the epicentre location and magnitude. Historical seismologists need a system for organizing the huge amount of data retrieved in their research, one able to keep track of the complex relations among these data. A tool addressing these needs may also be used in the opposite direction: tracing back each step of the research procedure, enabling seismologists to check the reliability of the background data, to spot potential errors or misinterpretations, and, possibly, to enrich and consolidate the description of an earthquake. This work, carried out in collaboration with an experienced historical seismologist, investigates the particular needs of this field of research and proposes new tools based on a Geographical Information System (GIS). Finally, a prototype system is presented. This solution makes it possible to store, manage, and analyse spatial and thematic data related to historical earthquakes, and it integrates the relevant data resulting from seismic studies with their original source documents. In particular, the conceptual model of the GIS spatial database is described, and some examples of maps and queries are discussed for a case study of two earthquakes which occurred in Locris (Greece) on the 20th and 27th of April 1894.
Journal Article
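A minimal sketch of the kind of entity relations such a system must preserve, from source documents through effect reports to earthquake parameters; the class and field names are hypothetical, not the prototype's actual conceptual model:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SourceDocument:
    """A coeval record: letter, newspaper article, chronicle, memoir."""
    title: str
    year: int
    archive: str

@dataclass
class EffectReport:
    """An earthquake effect described by a source at a given place."""
    place: str
    lat: float
    lon: float
    description: str
    intensity: Optional[int]   # macroseismic intensity, if assessable
    source: SourceDocument     # keeps the link back to the document

@dataclass
class Earthquake:
    date: str
    reports: List[EffectReport] = field(default_factory=list)

    def epicentre_estimate(self):
        """Naive intensity-weighted centroid of report locations
        (a placeholder for proper macroseismic methods)."""
        pts = [(r.lat, r.lon, r.intensity or 1) for r in self.reports]
        w = sum(p[2] for p in pts)
        return (sum(p[0] * p[2] for p in pts) / w,
                sum(p[1] * p[2] for p in pts) / w)
```

The point of the structure is traceability: every intensity estimate keeps a link back to the document it came from, so each step of the research procedure can be traced back, as the abstract describes.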
Detailed dynamic, geometric and supraglacial moraine data for Glaciar Pio XI, the only surge-type glacier of the Southern Patagonia Icefield
2016
In contrast to the general trend for glaciers of the Southern Patagonia Icefield, Glaciar Pio XI has experienced a large cumulative frontal advance since 1945. In an effort to better understand this advancing behaviour, this paper presents a synoptic analysis of frontal fluctuations (1998-2014), ice velocities (1986-2014), ice-surface elevations (1975-2007) and supraglacial moraines (1945-2014) derived from geospatial datasets. These analyses reveal changes in the ice flow of Glaciar Pio XI's freshwater calving northern terminus and tidewater calving southern terminus over recent decades. Between 1986 and 2000, ice flow speed generally accelerated, reaching peaks of >15 m d⁻¹ at the frontal edge of the southern terminus. Following this period, flow speed decreased, reducing to <1 m d⁻¹ for the central part of the southern terminus in 2014, despite advancing to a neoglacial maximum. From 2000 to 2014 the reduction in speed was accompanied by a shift in maximum velocity away from the southern terminus, towards the central glacier trunk. As a result, the northern terminus, which accelerated during this period, represented the new primary flow path in 2014. Notably, the moraine maps presented highlight surges occurring around 1981 and again between 1997 and 2000, marked by arcuate moraine features on the southern terminus.
Journal Article