Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
620 result(s) for "Natural disasters Simulation methods."
Coastal disaster surveys and assessment for risk mitigation
by Shibayama, Tomoya, editor; Esteban, Miguel, editor
in Coastal engineering; Coastal zone management; Natural disasters Simulation methods
2023
\"This collection covers the essential concepts in the management of coastal disasters, outlining several field surveys of coastal disasters in the 21st century, including the Indian Ocean and Tohoku Tsunamis, and the storm surges of Hurricane Katrina, Cyclone Nargis, and Typhoon Haiyan. Measurements of flood heights, distributions of structural destruction, and residents' testimonies are reported, and the results are analysed and compared with past events and numerical simulations, with the reality of these disasters reconstructed. The book then covers the current understanding of disaster mechanisms and the most advanced tools for future simulation. Uniquely explains how to use disaster surveys along with simulations to mitigate risk. Combines pure scientific studies with practical trials and proposes future procedures for effective coastal disaster mitigation. Coastal Disaster Surveys and Assessment for Risk Mitigation is ideal for students in the disaster field as well as engineers who manage tsunamis, storm surges, high wave attacks and coastal erosion\"-- Provided by publisher.
Constructing Priors that Penalize the Complexity of Gaussian Random Fields
by Lindgren, Finn; Rue, Håvard; Simpson, Daniel
in Analysis of covariance; Annual precipitation; Atmospheric precipitation
2019
Priors are important for achieving proper posteriors with physically meaningful covariance structures for Gaussian random fields (GRFs) since the likelihood typically only provides limited information about the covariance structure under in-fill asymptotics. We extend the recent penalized complexity prior framework and develop a principled joint prior for the range and the marginal variance of one-dimensional, two-dimensional, and three-dimensional Matérn GRFs with fixed smoothness. The prior is weakly informative and penalizes complexity by shrinking the range toward infinity and the marginal variance toward zero. We propose guidelines for selecting the hyperparameters, and a simulation study shows that the new prior provides a principled alternative to reference priors that can leverage prior knowledge to achieve shorter credible intervals while maintaining good coverage.
We extend the prior to a nonstationary GRF parameterized through local ranges and marginal standard deviations, and introduce a scheme for selecting the hyperparameters based on the coverage of the parameters when fitting simulated stationary data. The approach is applied to a dataset of annual precipitation in southern Norway and the scheme for selecting the hyperparameters leads to conservative estimates of nonstationarity and improved predictive performance over the stationary model. Supplementary materials for this article are available online.
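For readers who want the form of the prior: a sketch of the joint penalized-complexity prior as it is commonly stated for a Matérn GRF in $d$ dimensions with fixed smoothness, written here in terms of the range $\rho$ and the marginal standard deviation $\sigma$ (the article works with the marginal variance, so its exact parameterization may differ); $\rho_0$, $\alpha_1$, $\sigma_0$, $\alpha_2$ are user-chosen tail-probability targets.

$$\pi(\rho,\sigma)\;=\;\frac{d}{2}\,\lambda_1\,\rho^{-d/2-1}\,e^{-\lambda_1\rho^{-d/2}}\;\cdot\;\lambda_2\,e^{-\lambda_2\sigma},\qquad
\lambda_1=-\log(\alpha_1)\,\rho_0^{d/2},\quad \lambda_2=-\frac{\log(\alpha_2)}{\sigma_0},$$

so that $P(\rho<\rho_0)=\alpha_1$ and $P(\sigma>\sigma_0)=\alpha_2$. The exponential tails shrink the field toward the base model of infinite range and zero marginal variance, which is the sense in which the prior penalizes complexity.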
Journal Article
Resilience: An Indicator of Recovery Capability in Intermodal Freight Transport
2012
In this paper, an indicator of network resilience is defined that quantifies the ability of an intermodal freight transport network to recover from disruptions due to natural or human-caused disaster. The indicator considers the network's inherent ability to cope with the negative consequences of disruptions as a result of its topological and operational attributes. Furthermore, the indicator explicitly accounts for the impact of potential recovery activities that might be taken in the immediate aftermath of the disruption to meet target operational service levels while adhering to a fixed budget. A stochastic mixed-integer program is proposed for quantifying network resilience and identifying an optimal postevent course of action (i.e., set of activities) to take. To solve this mathematical program, a technique that accounts for dependencies in random link attributes based on concepts of Benders decomposition, column generation, and Monte Carlo simulation is proposed. Experiments were conducted to illustrate the resilience concept and procedure for its measurement, and to assess the role of network topology in its magnitude.
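As a rough illustration of the idea only (not the paper's formulation, which uses a stochastic mixed-integer program solved with Benders decomposition, column generation, and Monte Carlo simulation): a minimal Python sketch that treats resilience as the expected fraction of pre-disruption demand that can still be satisfied after a random disruption and a budget-limited recovery step. The toy network, disruption model, and recovery rule below are all hypothetical.

```python
import random

# Hypothetical toy setting: one origin-destination pair served by parallel routes.
NOMINAL = {"rail": 100.0, "road": 60.0, "barge": 40.0}  # nominal route capacities
DEMAND = 150.0       # pre-disruption freight demand
BUDGET = 45.0        # recovery budget (monetary units)
UNIT_COST = 1.5      # cost to restore one unit of lost capacity

def recover(damaged, budget):
    """Greedy recovery: restore lost capacity, largest loss first, until the budget runs out."""
    repaired = dict(damaged)
    for route in sorted(damaged, key=lambda r: NOMINAL[r] - damaged[r], reverse=True):
        lost = NOMINAL[route] - repaired[route]
        restored = min(lost, budget / UNIT_COST)
        repaired[route] += restored
        budget -= restored * UNIT_COST
    return repaired

def resilience(n_scenarios=10_000, seed=1):
    """Estimate resilience as the expected fraction of demand satisfied after recovery."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_scenarios):
        # Disruption scenario: each route keeps a random fraction of its nominal capacity.
        damaged = {r: c * rng.random() for r, c in NOMINAL.items()}
        repaired = recover(damaged, BUDGET)
        satisfied = min(DEMAND, sum(repaired.values()))
        total += satisfied / DEMAND
    return total / n_scenarios

print(f"estimated resilience indicator: {resilience():.3f}")
```

The optimization-based formulation in the paper replaces the greedy rule with an optimal choice of recovery activities per scenario, but the indicator retains this "expected post-recovery service level" interpretation.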
Journal Article
Technical Support by Smart Glasses During a Mass Casualty Incident: A Randomized Controlled Simulation Trial on Technically Assisted Triage and Telemedical App Use in Disaster Medicine
by Follmann, Andreas; Rossaint, Rolf; Hochhausen, Nadine
in Algorithms; Augmentation; Augmented Reality
2019
Triage is essential in disaster medicine so that many patients can be treated despite limited personnel resources. Various triage algorithms help but are often used incorrectly or not at all. One potential problem-solving approach is to support triage with Smart Glasses.
In this study, augmented reality was used to display a triage algorithm, and telemedicine assistance was enabled, to compare the duration and quality of triage with those of conventional triage.
A specific Android app was designed for use with Smart Glasses, adding information through augmented reality in two different ways: the display of a triage algorithm in the data glasses, and a telemedical connection to a senior emergency physician via the integrated camera. A scenario was created (i.e., a randomized simulation study) in which 31 paramedics carried out a triage of 12 patients in 3 groups as follows: without technical support (control group), with a triage algorithm display, and with telemedical contact.
A total of 362 assessments were performed. The accuracy in the control group was only 58%, but the assessments were quicker (on average 16.6 seconds). In contrast, an accuracy of 92% (P=.04) was achieved when using technical support by displaying the triage algorithm. This triaging took an average of 37.0 seconds. The triage group wearing data glasses and being telemedically connected achieved 90% accuracy (P=.01) in 35.0 seconds.
Triage with data glasses required markedly more time. While only a tally was recorded in the control group, Smart Glasses led to digital capture of the triage results, which have many tactical advantages. We expect a high potential in the application of Smart Glasses in disaster scenarios when using telemedicine and augmented reality features to improve the quality of triage.
Journal Article
Detecting and dating structural breaks in functional data without dimension reduction
by Aue, Alexander; Sönmez, Ozan; Rice, Gregory
in Asymptotic methods; Change point analysis; Computer simulation
2018
Methodology is proposed to uncover structural breaks in functional data; it is ‘fully functional’ in the sense that it does not rely on dimension reduction techniques. A thorough asymptotic theory is developed for a fully functional break detection procedure as well as for a break date estimator, under both fixed and shrinking break size regimes. The latter result is utilized to derive confidence intervals for the unknown break date. The main results highlight that the fully functional procedures perform best under conditions in which analogous estimators based on functional principal component analysis are at their worst, namely when the feature of interest is orthogonal to the leading principal components of the data. The theoretical findings are confirmed by means of a Monte Carlo simulation study in finite samples. An application to annual temperature curves illustrates the practical relevance of the proposed procedures.
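To make the "fully functional" idea concrete, a sketch of the kind of CUSUM-based statistic such procedures use (notation chosen here for illustration; the exact statistic and scaling in the article may differ). For observed curves $X_1,\dots,X_N$ with values $X_i(t)$,

$$S_N(x,t)\;=\;\frac{1}{\sqrt{N}}\left(\sum_{i=1}^{\lfloor Nx\rfloor}X_i(t)\;-\;x\sum_{i=1}^{N}X_i(t)\right),\qquad x\in[0,1],$$

with break detection based on $T_N=\max_{x\in[0,1]}\int S_N(x,t)^2\,dt$ and the break date estimated by the maximizing $x$. Because the statistic integrates over $t$ rather than projecting onto a few leading principal components, breaks that are orthogonal to those components are not lost.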
Journal Article
Optimal Penalized Function-on-Function Regression Under a Reproducing Kernel Hilbert Space Framework
2018
Many scientific studies collect data where the response and predictor variables are both functions of time, location, or some other covariate. Understanding the relationship between these functional variables is a common goal in these studies. Motivated by two real-life examples, we present in this article a function-on-function regression model that can be used to analyze this kind of functional data. Our estimator of the 2D coefficient function is the optimizer of a form of penalized least squares where the penalty enforces a certain level of smoothness on the estimator. Our first result is the representer theorem, which states that the exact optimizer of the penalized least squares actually resides in a data-adaptive finite-dimensional subspace, although the optimization problem is defined on a function space of infinite dimensions. This theorem then allows us to easily incorporate Gaussian quadrature into the optimization of the penalized least squares, which can be carried out through standard numerical procedures. We also show that our estimator achieves the minimax convergence rate in mean prediction under the framework of function-on-function regression. Extensive simulation studies demonstrate the numerical advantages of our method over existing ones; a sparse functional data extension is also introduced. The proposed method is then applied to our motivating examples of the benchmark Canadian weather data and a histone regulation study. Supplementary materials for this article are available online.
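A sketch of the standard function-on-function setup the abstract refers to (the symbols below are generic notation, not necessarily the article's): for response curves $Y_i(t)$ and predictor curves $X_i(s)$,

$$Y_i(t)\;=\;\alpha(t)\;+\;\int X_i(s)\,\beta(s,t)\,ds\;+\;\varepsilon_i(t),$$

and the 2D coefficient surface is estimated by penalized least squares of the form

$$\min_{\alpha,\beta}\;\sum_{i=1}^{n}\int\Big(Y_i(t)-\alpha(t)-\int X_i(s)\,\beta(s,t)\,ds\Big)^{2}dt\;+\;\lambda\,J(\beta),$$

where $J(\beta)$ is a roughness penalty (a squared norm in the reproducing kernel Hilbert space framework) and $\lambda>0$ controls smoothness. The representer theorem mentioned above guarantees the minimizer lies in a finite-dimensional, data-adaptive subspace, so the integrals can be handled with Gaussian quadrature and standard numerical routines.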
Journal Article
Quantifying uncertainty and variable sensitivity within the US billion-dollar weather and climate disaster cost estimates
2015
Research examining the costs of natural disasters to social and economic systems is substantial. However, there are few empirical studies that seek to quantify the uncertainty and establish confidence intervals surrounding natural disaster cost estimates (ex post). To better frame the data limitations associated with natural disaster loss estimates, a range of losses can be evaluated by conducting multiple analyses and varying certain input parameters to which the losses are most sensitive. This paper contributes to the literature by examining new approaches for better understanding the uncertainty surrounding three US natural disaster cost estimate case studies, via Monte Carlo simulations to quantify the 95, 90 and 75 % confidence intervals. This research also performs a sensitivity analysis for one of the case studies, examining which input data variables and assumptions are the most sensitive and contribute most to the overall uncertainty of the estimate. The Monte Carlo simulations for all three of the natural disaster events examined provide additional confidence in the US billion-dollar weather and climate disaster loss estimate report (NCDC 2014), since these estimates are within the confidence limits and near the mean and median of the example simulations. The normalized sensitivity analysis of Hurricane Ike damage costs determined that commercial losses in Texas are the most sensitive to assumption variability. Therefore, improvements in quantifying the commercial insurance participation rate for Texas will result in the largest reduction of uncertainty in the total loss estimate for Hurricane Ike. Further minimization of uncertainty would continue with improved measurement of subsequent cost parameters in order of descending sensitivity.
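A minimal sketch of the kind of Monte Carlo exercise described; the loss components, participation-rate ranges, and figures below are invented for illustration and are not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo draws

# Hypothetical loss components (billion USD). Insured losses are treated as known;
# the insurance participation rates that scale insured losses up to total losses
# are the uncertain, most sensitive inputs.
insured_residential = 5.0
insured_commercial = 4.0
res_participation = rng.uniform(0.45, 0.60, N)   # assumed plausible range
com_participation = rng.uniform(0.30, 0.55, N)   # widest range: most sensitive assumption

total_loss = insured_residential / res_participation + insured_commercial / com_participation

for level in (75, 90, 95):
    lo, hi = np.percentile(total_loss, [(100 - level) / 2, 100 - (100 - level) / 2])
    print(f"{level}% interval: {lo:.1f} to {hi:.1f} billion USD")
print(f"mean {total_loss.mean():.1f}, median {np.median(total_loss):.1f} billion USD")
```

Narrowing the assumed range for the most sensitive input (here, the commercial participation rate) shrinks the resulting intervals the most, which is the point of pairing the simulation with a sensitivity analysis.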
Journal Article
Accuracy of a Commercial Large Language Model (ChatGPT) to Perform Disaster Triage of Simulated Patients Using the Simple Triage and Rapid Treatment (START) Protocol: Gage Repeatability and Reproducibility Study
by Hertelendy, Attila Julius; Franc, Jeffrey Micheal; Verde, Manuela
in Disaster Medicine - methods; Disasters; Humans
2024
The release of ChatGPT (OpenAI) in November 2022 drastically reduced the barrier to using artificial intelligence by allowing a simple web-based text interface to a large language model (LLM). One use case where ChatGPT could be useful is in triaging patients at the site of a disaster using the Simple Triage and Rapid Treatment (START) protocol. However, LLMs experience several common errors including hallucinations (also called confabulations) and prompt dependency.
This study addresses the research problem: "Can ChatGPT adequately triage simulated disaster patients using the START protocol?" by measuring three outcomes: repeatability, reproducibility, and accuracy.
Nine prompts were developed by 5 disaster medicine physicians. A Python script queried ChatGPT Version 4 for each prompt combined with 391 validated simulated patient vignettes. Ten repetitions of each combination were performed for a total of 35,190 simulated triages. A reference standard START triage code for each simulated case was assigned by 2 disaster medicine specialists (JMF and MV), with a third specialist (LC) added if the first two did not agree. Results were evaluated using a gage repeatability and reproducibility study (gage R and R). Repeatability was defined as variation due to repeated use of the same prompt. Reproducibility was defined as variation due to the use of different prompts on the same patient vignette. Accuracy was defined as agreement with the reference standard.
Although 35,102 (99.7%) queries returned a valid START score, there was considerable variability. Repeatability (use of the same prompt repeatedly) was 14% of the overall variation. Reproducibility (use of different prompts) was 4.1% of the overall variation. The accuracy of ChatGPT for START was 63.9% with a 32.9% overtriage rate and a 3.1% undertriage rate. Accuracy varied by prompt with a maximum of 71.8% and a minimum of 46.7%.
This study indicates that ChatGPT version 4 is insufficient to triage simulated disaster patients via the START protocol. It demonstrated suboptimal repeatability and reproducibility. The overall accuracy of triage was only 63.9%. Health care professionals are advised to exercise caution while using commercial LLMs for vital medical determinations, given that these tools may commonly produce inaccurate data, colloquially referred to as hallucinations or confabulations. Artificial intelligence-guided tools should undergo rigorous statistical evaluation-using methods such as gage R and R-before implementation into clinical settings.
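For context on the task itself, a minimal Python sketch of the START decision logic that the simulated patients were triaged against; this is a simplified, illustrative encoding of the adult algorithm (airway repositioning is folded into the breathing check), not the study's code or prompts.

```python
def start_triage(can_walk: bool, breathing: bool, resp_rate: int,
                 radial_pulse: bool, cap_refill_sec: float,
                 obeys_commands: bool) -> str:
    """Return a START category for one simulated adult patient (simplified)."""
    if can_walk:
        return "GREEN"    # minor: walking wounded
    if not breathing:
        return "BLACK"    # expectant: not breathing after airway repositioning
    if resp_rate > 30:
        return "RED"      # immediate: respiratory rate above 30/min
    if not radial_pulse or cap_refill_sec > 2:
        return "RED"      # immediate: poor perfusion
    if not obeys_commands:
        return "RED"      # immediate: does not follow simple commands
    return "YELLOW"       # delayed

# Example vignette: non-ambulatory, breathing at 24/min, radial pulse present,
# capillary refill 1.5 s, follows commands -> expected YELLOW.
print(start_triage(False, True, 24, True, 1.5, True))
```

The reference standard in the study plays the role of this deterministic rule set; the gage R and R analysis then asks how consistently the LLM reproduces it across prompts and repetitions.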
Journal Article
Adaptive Bayesian Time-Frequency Analysis of Multivariate Time Series
2019
This article introduces a nonparametric approach to multivariate time-varying power spectrum analysis. The procedure adaptively partitions a time series into an unknown number of approximately stationary segments, where some spectral components may remain unchanged across segments, allowing components to evolve differently over time. Local spectra within segments are fit through Whittle likelihood-based penalized spline models of modified Cholesky components, which provide flexible nonparametric estimates that preserve positive definite structures of spectral matrices. The approach is formulated in a Bayesian framework, in which the number and location of partitions are random, and relies on reversible jump Markov chain and Hamiltonian Monte Carlo methods that can adapt to the unknown number of segments and parameters. By averaging over the distribution of partitions, the approach can approximate both abrupt and slowly varying changes in spectral matrices. Empirical performance is evaluated in simulation studies and illustrated through analyses of electroencephalography during sleep and of the El Niño-Southern Oscillation. Supplementary materials for this article are available online.
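As a pointer to the key ingredient, a sketch of the multivariate Whittle log-likelihood used within a locally stationary segment (constants dropped; notation here is generic): for periodogram matrices $I(\omega_k)$ at the Fourier frequencies $\omega_k$ of the segment and spectral matrix $f(\omega_k)$,

$$\ell_W(f)\;\approx\;-\sum_{k}\Big(\log\det f(\omega_k)\;+\;\operatorname{tr}\!\big[f(\omega_k)^{-1}I(\omega_k)\big]\Big),$$

which the approach evaluates segment by segment, with $f$ parameterized through penalized splines on modified Cholesky components so that the estimated spectral matrices remain positive definite.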
Journal Article
Natural hazards, disaster management and simulation: a bibliometric analysis of keyword searches
2019
Disasters affect millions of people annually, causing large numbers of fatalities, detrimental economic impact and the displacement of communities. Policy-makers, researchers and industry professionals are regularly faced with these consequences and therefore require tools to assess the potential impacts and provide sustainable solutions, often with only very limited information. This paper focuses on the themes of “disaster management”, “natural hazards” and “simulation”, aiming to identify current research trends using bibliometric analysis. This analysis technique combines quantitative and statistical methods to identify these trends, assess quality and measure development. The study concluded that natural hazards (73%) are more predominant in research than man-made hazards (14%). Of the man-made hazards covered, terrorism is the most prevalent (83%). The most frequent disaster types are climate related, and in this study hydrological (20%), geophysical (20%), meteorological (15%) and climatological (5%) were the most frequently researched. Asia experiences the highest number of disaster events as a continent but in this study was only included in 11% of papers, with North America being the most recurrent (59%). There were some surprising omissions, such as Africa, which did not feature in a single paper. Despite the inclusion of the keywords “simulation” and “agent based” in the searches, the study did not demonstrate that a large volume of research is being carried out using numerical modelling techniques. Finally, research appears to take a reactive rather than proactive approach to disaster management planning, but the merit of this approach is questionable.
Journal Article