Catalogue Search | MBRL
531 result(s) for "Natural disasters Simulation methods."
Coastal disaster surveys and assessment for risk mitigation
by Shibayama, Tomoya, editor; Esteban, Miguel, editor
in Coastal engineering; Coastal zone management; Natural disasters Simulation methods
2023
"This collection covers the essential concepts in the management of coastal disasters, outlining several field surveys of coastal disasters in the 21st century, including the Indian Ocean and Tohoku Tsunamis, and the storm surges of Hurricane Katrina, Cyclone Nargis, and Typhoon Haiyan. Measurements of flood heights, distributions of structural destruction, and residents' testimonies are reported, and the results are analysed and compared with past events and numerical simulations, with the reality of these disasters reconstructed. The book then covers the current understanding of disaster mechanisms and the most advanced tools for future simulation. Uniquely explains how to use disaster surveys along with simulations to mitigate risk. Combines pure scientific studies with practical trials and proposes future procedures for effective coastal disaster mitigation. Coastal Disaster Surveys and Assessment for Risk Mitigation is ideal for students in the disaster field as well as engineers who manage tsunamis, storm surges, high wave attacks and coastal erosion" -- Provided by publisher.
Constructing Priors that Penalize the Complexity of Gaussian Random Fields
by Lindgren, Finn; Rue, Håvard; Simpson, Daniel
in Analysis of covariance; Annual precipitation; atmospheric precipitation
2019
Priors are important for achieving proper posteriors with physically meaningful covariance structures for Gaussian random fields (GRFs) since the likelihood typically only provides limited information about the covariance structure under in-fill asymptotics. We extend the recent penalized complexity prior framework and develop a principled joint prior for the range and the marginal variance of one-dimensional, two-dimensional, and three-dimensional Matérn GRFs with fixed smoothness. The prior is weakly informative and penalizes complexity by shrinking the range toward infinity and the marginal variance toward zero. We propose guidelines for selecting the hyperparameters, and a simulation study shows that the new prior provides a principled alternative to reference priors that can leverage prior knowledge to achieve shorter credible intervals while maintaining good coverage.
We extend the prior to a nonstationary GRF parameterized through local ranges and marginal standard deviations, and introduce a scheme for selecting the hyperparameters based on the coverage of the parameters when fitting simulated stationary data. The approach is applied to a dataset of annual precipitation in southern Norway and the scheme for selecting the hyperparameters leads to conservative estimates of nonstationarity and improved predictive performance over the stationary model. Supplementary materials for this article are available online.
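The penalized-complexity construction described in this abstract can be sketched in a few lines: hyperparameters are set from user-chosen tail probabilities, and the prior shrinks the range toward infinity and the marginal standard deviation toward zero. The densities below follow the usual PC-prior parameterization for a fixed-smoothness Matérn field; treat them as an illustrative sketch under that assumption, not a verbatim transcription of the paper.

```python
import math

def pc_prior_hyperparams(d, rho0, alpha_rho, sigma0, alpha_sigma):
    """Choose scale parameters from tail probabilities
    P(rho < rho0) = alpha_rho and P(sigma > sigma0) = alpha_sigma."""
    lam_rho = -math.log(alpha_rho) * rho0 ** (d / 2)
    lam_sigma = -math.log(alpha_sigma) / sigma0
    return lam_rho, lam_sigma

def pc_prior_logdensity(rho, sigma, d, lam_rho, lam_sigma):
    """Joint log-density of the (illustrative) PC prior for the range rho
    and marginal std. dev. sigma of a d-dimensional Matérn GRF."""
    # pi(rho) = (d/2) * lam * rho^(-d/2 - 1) * exp(-lam * rho^(-d/2)):
    # mass escapes toward large ranges (the simpler "base model").
    log_p_rho = (math.log(d / 2) + math.log(lam_rho)
                 - (d / 2 + 1) * math.log(rho)
                 - lam_rho * rho ** (-d / 2))
    # pi(sigma) = lam * exp(-lam * sigma): shrinks the variance toward zero.
    log_p_sigma = math.log(lam_sigma) - lam_sigma * sigma
    return log_p_rho + log_p_sigma

# Example: a 2D field where we believe P(rho < 0.1) = 0.05 and
# P(sigma > 1.0) = 0.05.
lam_rho, lam_sigma = pc_prior_hyperparams(2, 0.1, 0.05, 1.0, 0.05)
```

With this parameterization the tail-probability calibration is exact: plugging the hyperparameters back into the prior CDFs recovers the requested probabilities.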
Journal Article
Resilience: An Indicator of Recovery Capability in Intermodal Freight Transport
2012
In this paper, an indicator of network resilience is defined that quantifies the ability of an intermodal freight transport network to recover from disruptions due to natural or human-caused disaster. The indicator considers the network's inherent ability to cope with the negative consequences of disruptions as a result of its topological and operational attributes. Furthermore, the indicator explicitly accounts for the impact of potential recovery activities that might be taken in the immediate aftermath of the disruption to meet target operational service levels while adhering to a fixed budget. A stochastic mixed-integer program is proposed for quantifying network resilience and identifying an optimal postevent course of action (i.e., set of activities) to take. To solve this mathematical program, a technique that accounts for dependencies in random link attributes based on concepts of Benders decomposition, column generation, and Monte Carlo simulation is proposed. Experiments were conducted to illustrate the resilience concept and procedure for its measurement, and to assess the role of network topology in its magnitude.
Journal Article
Technical Support by Smart Glasses During a Mass Casualty Incident: A Randomized Controlled Simulation Trial on Technically Assisted Triage and Telemedical App Use in Disaster Medicine
by Follmann, Andreas; Rossaint, Rolf; Hochhausen, Nadine
in Algorithms; Augmentation; Augmented Reality
2019
To treat many patients despite lacking personnel resources, triage is important in disaster medicine. Various triage algorithms help but often are used incorrectly or not at all. One potential problem-solving approach is to support triage with Smart Glasses.
In this study, augmented reality was used to display a triage algorithm, and telemedicine assistance was enabled, so that the duration and quality of triage could be compared with conventional triage.
A specific Android app was designed for use with Smart Glasses, which added information via augmented reality in two different ways: through the display of a triage algorithm in the data glasses and through a telemedical connection to a senior emergency physician realized by the integrated camera. A scenario was created (ie, a randomized simulation study) in which 31 paramedics carried out triage of 12 patients in 3 groups as follows: without technical support (control group), with a triage algorithm display, and with telemedical contact.
A total of 362 assessments were performed. The accuracy in the control group was only 58%, but the assessments were quicker (on average 16.6 seconds). In contrast, an accuracy of 92% (P=.04) was achieved when using technical support by displaying the triage algorithm. This triaging took an average of 37.0 seconds. The triage group wearing data glasses and being telemedically connected achieved 90% accuracy (P=.01) in 35.0 seconds.
Triage with data glasses required markedly more time. While only a tally was recorded in the control group, Smart Glasses led to digital capture of the triage results, which have many tactical advantages. We expect a high potential in the application of Smart Glasses in disaster scenarios when using telemedicine and augmented reality features to improve the quality of triage.
Journal Article
Detecting and dating structural breaks in functional data without dimension reduction
by Aue, Alexander; Sönmez, Ozan; Rice, Gregory
in Asymptotic methods; Change point analysis; Computer simulation
2018
Methodology is proposed to uncover structural breaks in functional data that is ‘fully functional’ in the sense that it does not rely on dimension reduction techniques. A thorough asymptotic theory is developed for a fully functional break detection procedure as well as for a break date estimator, assuming a fixed break size and a shrinking break size. The latter result is utilized to derive confidence intervals for the unknown break date. The main results highlight that the fully functional procedures perform best under conditions when analogous estimators based on functional principal component analysis are at their worst, namely when the feature of interest is orthogonal to the leading principal components of the data. The theoretical findings are confirmed by means of a Monte Carlo simulation study in finite samples. An application to annual temperature curves illustrates the practical relevance of the procedures proposed.
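A minimal illustration of fully functional break detection is a CUSUM-type statistic computed directly on the curves, with no dimension reduction. The simulated data, grid size, and break location below are illustrative choices, not the authors' exact procedure or asymptotic theory:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate functional data: n curves observed on a common grid,
# with a mean-function shift after an unknown break point k0.
n, grid = 200, 50
k0 = 120
shift = 0.8 * np.sin(np.linspace(0, np.pi, grid))
X = rng.normal(size=(n, grid))
X[k0:] += shift

# Fully functional CUSUM: for each candidate break k, form the partial-sum
# process S_n(k) = (sum of first k curves - (k/n) * total sum) / sqrt(n)
# and integrate its square over the grid. No principal components are used.
total_sum = X.sum(axis=0)
stats = np.empty(n - 1)
for k in range(1, n):
    S = (X[:k].sum(axis=0) - (k / n) * total_sum) / np.sqrt(n)
    stats[k - 1] = np.sum(S ** 2)

k_hat = int(np.argmax(stats)) + 1  # estimated break date
```

The statistic's expectation is maximized at the true break date, so its argmax serves as the break-date estimator; confidence intervals, as in the abstract, require the additional asymptotic theory.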
Journal Article
Optimal Penalized Function-on-Function Regression Under a Reproducing Kernel Hilbert Space Framework
2018
Many scientific studies collect data where the response and predictor variables are both functions of time, location, or some other covariate. Understanding the relationship between these functional variables is a common goal in these studies. Motivated by two real-life examples, we present in this article a function-on-function regression model that can be used to analyze this kind of functional data. Our estimator of the 2D coefficient function is the optimizer of a form of penalized least squares where the penalty enforces a certain level of smoothness on the estimator. Our first result is the representer theorem, which states that the exact optimizer of the penalized least squares actually resides in a data-adaptive finite-dimensional subspace, although the optimization problem is defined on a function space of infinite dimensions. This theorem then allows easy incorporation of Gaussian quadrature into the optimization of the penalized least squares, which can be carried out through standard numerical procedures. We also show that our estimator achieves the minimax convergence rate in mean prediction under the framework of function-on-function regression. Extensive simulation studies demonstrate the numerical advantages of our method over existing ones, and a sparse functional data extension is also introduced. The proposed method is then applied to our motivating examples of the benchmark Canadian weather data and a histone regulation study. Supplementary materials for this article are available online.
Journal Article
Quantifying uncertainty and variable sensitivity within the US billion-dollar weather and climate disaster cost estimates
2015
Research examining natural disaster costs on social and economic systems is substantial. However, there are few empirical studies that seek to quantify the uncertainty and establish confidence intervals surrounding natural disaster cost estimates (ex post). To better frame the data limitations associated with natural disaster loss estimates, a range of losses can be evaluated by conducting multiple analyses and varying certain input parameters to which the losses are most sensitive. This paper contributes to the literature by examining new approaches for better understanding the uncertainty surrounding three US natural disaster cost estimate case studies, via Monte Carlo simulations to quantify the 95, 90, and 75% confidence intervals. This research also performs a sensitivity analysis for one of the case studies examining which input data variables and assumptions are the most sensitive and contribute most to the overall uncertainty of the estimate. The Monte Carlo simulations for all three of the natural disaster events examined provide additional confidence in the US billion-dollar weather and climate disaster loss estimate report (NCDC 2014), since these estimates are within the confidence limits and near the mean and median of the example simulations. The normalized sensitivity analysis of Hurricane Ike damage costs determined that commercial losses in Texas are the most sensitive to assumption variability. Therefore, improvements in quantifying the commercial insurance participation rate for Texas will result in the largest reduction of uncertainty in the total loss estimate for Hurricane Ike. Further minimization of uncertainty would continue with improved measurement of subsequent cost parameters in order of descending sensitivity.
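The Monte Carlo approach described here, varying the sensitive input parameters and reading confidence intervals off the simulated loss distribution, can be sketched as follows. All figures (insured loss, participation rate, uninsured component) are hypothetical placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical loss model: total loss = insured losses grossed up by an
# uncertain insurance participation rate, plus an uncorrelated uninsured
# component. Every number below is illustrative, not NCDC data.
insured_losses = 12.0                              # $ billions (point estimate)
participation = rng.normal(0.60, 0.05, 100_000)    # uncertain participation rate
uninsured = rng.normal(3.0, 1.0, 100_000)          # $ billions

# Clip the rate away from zero so the gross-up stays physically plausible.
total = insured_losses / np.clip(participation, 0.3, 0.9) + uninsured

# Percentile-based 95%, 90%, and 75% confidence intervals, as in the paper.
for level in (95, 90, 75):
    tail = (100 - level) / 2
    lo, hi = np.percentile(total, [tail, 100 - tail])
    print(f"{level}% CI: [{lo:.1f}, {hi:.1f}] $bn")
```

The sensitivity analysis the abstract describes amounts to repeating this simulation while widening or narrowing one input distribution at a time (for example, the participation rate) and observing how much the interval widths respond.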
Journal Article
Adaptive Bayesian Time-Frequency Analysis of Multivariate Time Series
2019
This article introduces a nonparametric approach to multivariate time-varying power spectrum analysis. The procedure adaptively partitions a time series into an unknown number of approximately stationary segments, where some spectral components may remain unchanged across segments, allowing components to evolve differently over time. Local spectra within segments are fit through Whittle likelihood-based penalized spline models of modified Cholesky components, which provide flexible nonparametric estimates that preserve positive definite structures of spectral matrices. The approach is formulated in a Bayesian framework, in which the number and location of partitions are random, and relies on reversible jump Markov chain and Hamiltonian Monte Carlo methods that can adapt to the unknown number of segments and parameters. By averaging over the distribution of partitions, the approach can approximate both abrupt and slowly varying changes in spectral matrices. Empirical performance is evaluated in simulation studies and illustrated through analyses of electroencephalography during sleep and of the El Niño-Southern Oscillation. Supplementary materials for this article are available online.
Journal Article
Uncertainty Quantification for Computer Models With Spatial Output Using Calibration-Optimal Bases
by Kharin, Viatcheslav; Salter, James M.; Scinocca, John
in Algorithms; Bayesian calibration; Calibration
2019
The calibration of complex computer codes using uncertainty quantification (UQ) methods is a rich area of statistical methodological development. When applying these techniques to simulators with spatial output, it is now standard to use principal component decomposition to reduce the dimensions of the outputs in order to allow Gaussian process emulators to predict the output for calibration. We introduce the "terminal case," in which the model cannot reproduce observations to within model discrepancy, and for which standard calibration methods in UQ fail to give sensible results. We show that even when there is no such issue with the model, the standard decomposition on the outputs can and usually does lead to a terminal case analysis. We present a simple test to allow a practitioner to establish whether their experiment will result in a terminal case analysis, and a methodology for defining calibration-optimal bases that avoid this whenever it is not inevitable. We present the optimal rotation algorithm for doing this, and demonstrate its efficacy for an idealized example for which the usual principal component methods fail. We apply these ideas to the CanAM4 model to demonstrate the terminal case issue arising for climate models. We discuss climate model tuning and the estimation of model discrepancy within this context, and show how the optimal rotation algorithm can be used in developing practical climate model tuning tools. Supplementary materials for this article are available online.
Journal Article
Natural hazards, disaster management and simulation: a bibliometric analysis of keyword searches
2019
Disasters affect millions of people annually, causing large numbers of fatalities, detrimental economic impact and the displacement of communities. Policy-makers, researchers and industry professionals are regularly faced with these consequences and therefore require tools to assess the potential impacts and provide sustainable solutions, often with only very limited information. This paper focuses on the themes of “disaster management”, “natural hazards” and “simulation”, aiming to identify current research trends using bibliometric analysis. This analysis technique combines quantitative and statistical methods to identify these trends, assess quality and measure development. The study concluded that natural hazards (73%) are more predominant in research than man-made hazards (14%). Of the man-made hazards covered, terrorism is the most prevalent (83%). The most frequent disaster types are climate related, and in this study hydrological (20%), geophysical (20%), meteorological (15%) and climatological (5%) were the most frequently researched. Asia experiences the highest number of disaster events of any continent but was included in only 11% of the papers in this study, with North America being the most recurrent (59%). There were some surprising omissions, such as Africa, which did not feature in a single paper. Despite the inclusion of the keywords “simulation” and “agent based” in the searches, the study did not demonstrate that a large volume of research is being carried out using numerical modelling techniques. Finally, research appears to take a reactive rather than proactive approach to disaster management planning, but the merit of this approach is questionable.
Journal Article