Catalogue Search | MBRL
13 result(s) for "Grigoriu, Mircea"
Probabilistic Models for Two-Phase Materials
2025
Level-cut Gaussian/filtered Poisson, mosaic, and Voronoi tessellation random fields are used to model two-phase random materials. Essential properties of these random fields are reviewed and Monte Carlo algorithms for generating synthetic two-phase materials are presented. Numerical examples are used to illustrate the implementation and features of these models for two-phase materials.
Journal Article
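A minimal sketch of the level-cut Gaussian construction named in the abstract: a correlated Gaussian field is thresholded so that values above the cut form one phase and values below form the other. The grid size, correlation length, and target volume fraction are illustrative assumptions, not values from the paper.

```python
# Level-cut Gaussian model for a synthetic two-phase material (sketch).
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import norm

rng = np.random.default_rng(0)
n, corr_len, volume_fraction = 256, 8.0, 0.3   # assumed, for illustration

# Correlated Gaussian field: smooth white noise, then re-standardize.
field = gaussian_filter(rng.standard_normal((n, n)), sigma=corr_len)
field = (field - field.mean()) / field.std()

# Level cut chosen so that P(field > cut) equals the target volume fraction.
cut = norm.ppf(1.0 - volume_fraction)
phase = (field > cut).astype(int)   # 1 = inclusion phase, 0 = matrix phase

print("realized volume fraction:", phase.mean())
```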
Extremal dependence between temperature and ozone over the continental US
by Phalitnonkiat, Pakawat; Beaudry, Ellie; Plummer, David
in Atmospheric chemistry; Atmospheric temperature; Carbon dioxide
2018
The co-occurrence of heat waves and pollution events, and the resulting high mortality rates, emphasizes the importance of understanding joint temperature and pollution extremes. Through the use of extreme value theory and other statistical methods, tropospheric surface ozone and temperature extremes and their joint occurrence are analyzed over the United States during the summer months (JJA) using measurements and simulations of the present and future climate and chemistry. Five simulations from the Chemistry-Climate Model Initiative (CCMI) reference experiment using specified dynamics (REFC1SD) were analyzed: the CESM1 CAM4-chem, CHASER, CMAM, MOCAGE and MRI-ESM1r1 simulations. In addition, a 25-year present-day simulation branched off the CCMI REFC2 simulation in the year 2000 and a 25-year future simulation branched off the CCMI REFC2 simulation in 2100 were analyzed using CESM1 CAM4-chem. These last two simulations differed in their concentrations of carbon dioxide (representative of the years 2000 and 2100) but were otherwise identical. In general, regions with relatively high ozone extremes over the US do not coincide with regions of relatively high temperature extremes. A new metric, the spectral density, is developed to measure the joint extremal dependence of ozone and temperature by evaluating the spectral dependence of their extremes. While in many areas of the country ozone and temperature are highly correlated overall, the correlation is significantly reduced when examined on the higher end of the distributions. Measures of spectral density are less than about 0.35 everywhere, suggesting that extreme temperatures coincide with extreme ozone at most about a third of the time. Two regions of the US show the strongest measured extremal dependence of ozone and temperature: the northeast and the southeast. The simulated future increase in temperature and ozone is primarily due to a shift in their distributions, not to an increase in their extremes. The locations where the right-hand side of the temperature distribution does increase (by up to 30 %) are consistent with locations where soil-moisture feedback may be expected. Future changes in the right-hand side of the ozone distribution range regionally between +20 % and −10 %. The locations of future increases in the high-end tail of the ozone distribution are weakly related to those of temperature, with a correlation of 0.3. However, the regions where the temperature extremes increase are not those where the extremes in ozone are large, suggesting a muted ozone response.
Journal Article
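The paper's spectral density metric is not reproduced here; as a hedged stand-in, the sketch below computes the standard empirical tail-dependence coefficient chi(u) = P(X exceeds its u-quantile | Y exceeds its u-quantile), a related measure of joint extremal behavior. The synthetic series stand in for daily summer temperature and ozone, and illustrate the abstract's point that a high overall correlation can shrink in the tails.

```python
# Empirical tail dependence vs. overall correlation (illustrative data).
import numpy as np

rng = np.random.default_rng(1)
n = 5000
temperature = rng.standard_normal(n)
ozone = 0.8 * temperature + 0.6 * rng.standard_normal(n)  # correlated overall

def tail_dependence(x, y, u=0.95):
    """Empirical P(x > its u-quantile | y > its u-quantile)."""
    exceed_x = x > np.quantile(x, u)
    exceed_y = y > np.quantile(y, u)
    return exceed_x[exceed_y].mean()

print("overall correlation:", np.corrcoef(temperature, ozone)[0, 1])
print("chi(0.95):          ", tail_dependence(temperature, ozone))
```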
Non‐Stationary Probabilistic Tsunami Hazard Assessments Incorporating Climate‐Change‐Driven Sea Level Rise
by Winckler, Patricio; Haase, Jennifer S.; Sepúlveda, Ignacio
in Climate change; climate change driven sea level rise and tsunamis; Coasts
2021
We face a new era in the assessment of multiple natural hazards whose statistics are becoming alarmingly non-stationary due to ubiquitous long-term changes in climate. One particular case is tsunami hazard affected by climate-change-driven sea level rise (SLR). A traditional tsunami hazard assessment approach in which SLR is omitted, or included as a constant sea-level offset in a probabilistic calculation, may misrepresent the impacts of climate change. In this paper, a general method called non-stationary probabilistic tsunami hazard assessment (nPTHA) is developed to include long-term, time-varying changes in mean sea level. The nPTHA is based on a non-stationary Poisson process model, which takes advantage of the independence of arrivals within non-overlapping time intervals to specify a temporally varying mean hazard recurrence rate affected by SLR. The nPTHA is applied to the South China Sea (SCS) for tsunamis generated by earthquakes in the Manila Subduction Zone. The method provides unique and comprehensive results for inundation hazard, combining tsunami and SLR at a specific location over a given exposure time. The results show that in the SCS, SLR has a significant impact when its amplitude is comparable to that of tsunamis with moderate probability of exceedance. SLR and its associated uncertainty produce an impact on nPTHA results comparable to that caused by the uncertainty in the earthquake recurrence model. These findings are site-specific and must be analyzed for different regions. The proposed methodology, however, is sufficiently general to include other non-stationary phenomena and can be exploited for other hazards affected by SLR.

Plain Language Summary: Assessing natural hazards that are made worse by climate change cannot rely on previous methods that assume the average behavior is a good representation of the hazard. Here we show the effect of climate-change-driven sea level rise (SLR) on tsunami hazard, where the continuously increasing SLR cannot be represented by an average value. Higher sea levels produce several changes in tsunami behavior, including an increase in the maximum tsunami water level and in the speed at which the tsunami propagates. We introduce a new method which incorporates long-term, time-varying changes in mean sea level. The method can be applied to other coastal hazards, such as storm surge and waves. The new method is applied to port cities in the South China Sea (SCS) for tsunamis generated by earthquakes in the Manila Subduction Zone. We determine the probability of flooding urban areas within 50 and 100 years. The hazard in the SCS is significantly impacted by SLR when it rises by an amount comparable to the height of a tsunami with moderate likelihood. The effect is comparable to that caused by the estimated uncertainty in the recurrence interval of the causative earthquake. These results, though, are site-specific.

Key Points:
The impact of sea level rise (SLR) on probabilistic tsunami hazard assessment (PTHA) depends on the exposure time and the relative magnitude of both phenomena.
For the PTHA in the South China Sea, the SLR is as important as the uncertainty of the earthquake recurrence model.
SLR can change the tsunami propagation properties, so PTHA must include nonlinear effects in the tsunami behavior and inundation level.
Journal Article
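A minimal sketch of the non-stationary Poisson idea described above: the probability of at least one exceedance over an exposure time T follows from the integrated, time-varying rate, P = 1 − exp(−∫ λ(t) dt). The linear growth of λ(t) with sea level rise is a schematic assumption, not the paper's calibrated model.

```python
# Exceedance probability under a time-varying hazard recurrence rate (sketch).
import numpy as np

def exceedance_probability(rate_fn, exposure_years, n_steps=10_000):
    t = np.linspace(0.0, exposure_years, n_steps)
    t_mid = 0.5 * (t[1:] + t[:-1])
    integrated_rate = np.sum(rate_fn(t_mid) * np.diff(t))  # midpoint rule
    return 1.0 - np.exp(-integrated_rate)

base_rate = 1.0 / 500.0   # assumed stationary exceedance rate, per year
slr_growth = 0.01         # assumed fractional rate increase per year due to SLR
nonstationary = lambda t: base_rate * (1.0 + slr_growth * t)

print("stationary, 100 yr:    ", exceedance_probability(lambda t: base_rate + 0 * t, 100))
print("non-stationary, 100 yr:", exceedance_probability(nonstationary, 100))
```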
Temperature Extremes in the Community Atmosphere Model with Stochastic Parameterizations
by Berner, Judith; Tagle, Felipe; Grigoriu, Mircea D.
in Atmosphere; Atmospheric models; Backscatter
2016
This paper evaluates the performance of the NCAR Community Atmosphere Model, version 4 (CAM4), in simulating observed annual extremes of near-surface temperature and provides the first assessment of the impact of stochastic parameterizations of subgrid-scale processes on such performance. Two stochastic parameterizations are examined: the stochastic kinetic energy backscatter scheme and the stochastically perturbed parameterization tendency scheme. Temperature extremes are described in terms of 20-yr return levels and compared to those estimated from ERA-Interim and the Hadley Centre Global Climate Extremes Index 2 (HadEX2) observational dataset. CAM4 overestimates warm and cold extremes over land regions, particularly over the Northern Hemisphere, when compared against reanalysis. Similar spatial patterns, though less spatially coherent, emerge relative to HadEX2. The addition of a stochastic parameterization generally produces a warming of both warm and cold extremes relative to the unperturbed configuration; however, neither of the proposed parameterizations meaningfully reduces the biases in the simulated temperature extremes of CAM4. Adjusting warm and cold extremes by mean conditions in the respective annual extremes leads to good agreement between the models and reanalysis; however, adjusting for the bias in mean temperature does not help to reduce the observed discrepancies. Based on the behavior of the annual extremes, this study concludes that the distribution of temperature in CAM4 exhibits too much variability relative to that of reanalysis, while the stochastic parameterizations introduce a systematic bias in its mean rather than alter its variability.
Journal Article
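A sketch of the return-level calculation described above: fit a generalized extreme value (GEV) distribution to annual temperature maxima and read the 20-yr return level off as the 1 − 1/20 quantile. The synthetic maxima below are placeholders for 60 years of data, not values from the study.

```python
# 20-yr return level from a GEV fit of annual maxima (sketch).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
annual_maxima = 30.0 + 3.0 * rng.gumbel(size=60)   # surrogate annual maxima, deg C

shape, loc, scale = genextreme.fit(annual_maxima)
return_level_20yr = genextreme.ppf(1.0 - 1.0 / 20.0, shape, loc=loc, scale=scale)
print("20-yr return level:", return_level_20yr)
```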
Non‐Stationary Probabilistic Tsunami Hazard Assessments Compounding Tides and Sea Level Rise
by Winckler, Patricio; Haase, Jennifer S.; Sepúlveda, Ignacio
in Bathymetry; Coastal zone; Collocation
2022
Tides are often the largest source of sea level fluctuations. Two new probabilistic tsunami hazard assessment (PTHA) methods are proposed to combine the tidal phase uncertainty at the moment of tsunami occurrence with other sources of uncertainty. The first method adopts a Stochastic Reduced Order Model (SROM), producing sets of tidal phase samples to be used in tsunami simulations. The second method uses tsunami simulations with prescribed collocation tidal phases and tide probability distributions to model the uncertainty. The methods are extended to non-stationary probabilistic tsunami hazard assessment (nPTHA), compounding tsunamis, tides and sea level rise (SLR). As an illustration, these methods are applied to tsunamis generated in the Manila Subduction Zone, on the coasts of Kao Hsiung and Hong Kong. While the SROM-based method is faster at solving the PTHA if only tides are considered, the collocation-based method is faster when both SLR and tides are considered. For the illustration case, tides have a relevant impact on PTHA results; however, the SLR within an exposure time of 100 years has a stronger impact. PTHA curves of the maximum tsunami elevation are affected by tides and SLR differently: while both increase the dispersion of the PTHA hazard curve distributions, the latter also produces a translation toward higher elevations. The development of formulations based on SROM or collocation tides is the key to establishing a method that can feasibly be applied to other regions for comprehensive analysis at a global scale.

Plain Language Summary: Tsunami hazards can be evaluated using a probabilistic tsunami hazard assessment (PTHA) approach, which determines probabilities of exceeding a certain tsunami intensity. A relevant source of uncertainty in PTHA is tides. Tsunamis striking the coast may have different impacts depending on whether they arrive at high or low tide. Moreover, the exposed infrastructure may last many decades, and climate-change-driven sea level rise would compound tsunamis and tides, making the hazard worse. We develop two PTHA methods incorporating tides. The first method uses a stochastic reduced order model (SROM). The second method uses sets of tsunami simulations with the same tide, known as collocation tidal phases. Both methods aim to estimate tsunami uncertainties and can also be used for non-stationary probabilistic tsunami hazard assessment (nPTHA), which combines tsunamis, tides and sea level rise (SLR). The new methods are applied to an illustrative case in the South China Sea. The SROM-based method is faster at solving the PTHA with tides; the collocation-based method, however, is faster at solving the nPTHA with tides and SLR. While the impact of tides is relevant in PTHA and nPTHA results, SLR produces a greater effect in the studied ports. The conclusions are site-specific.

Key Points:
A Stochastic Reduced Order Model (SROM)-based method and a collocation-based method incorporating tides and sea level rise (SLR) are evaluated for tsunami hazard assessments.
The collocation-based method provides accurate results compared to the SROM-based method.
The impact of tides and SLR on tsunami intensities depends on the tidal range and the SLR within the exposure time, respectively.
Journal Article
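A hedged sketch of the collocation idea: tsunami simulations are run at a few prescribed tidal phases, and the unconditional exceedance probability is recovered by total probability, weighting each conditional result by the probability of that tidal state. The conditional exceedance values below are smooth placeholders standing in for simulated hazard curves, not the paper's results.

```python
# Marginalizing a tsunami exceedance probability over tidal phase (sketch).
import numpy as np

tidal_phases = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)   # collocation points
phase_weights = np.full(tidal_phases.size, 1.0 / tidal_phases.size)  # assumed uniform

# Placeholder for P(max elevation > threshold | tidal phase) from simulations.
conditional_exceedance = 0.05 * (1.0 + 0.8 * np.cos(tidal_phases))

unconditional = np.sum(phase_weights * conditional_exceedance)   # total probability
print("P(exceedance), marginalized over tide:", unconditional)
```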
An earthquake-source-based metric for seismic fragility analysis
2018
The seismic fragility of a system is the probability that the system enters a damage state under seismic ground motions with specified characteristics. Plots of seismic fragilities with respect to scalar ground motion intensity measures are called fragility curves. Recent studies show that fragility curves may not be satisfactory measures of structural seismic performance, since scalar intensity measures cannot comprehensively characterize site seismicity. The limitations of traditional seismic intensity measures, e.g., peak ground acceleration or pseudo-spectral acceleration, are shown and discussed in detail. A bivariate vector with coordinates moment magnitude m and source-to-site distance r is proposed as an alternative seismic intensity measure; fragility surfaces in the (m, r)-space can then be used as graphical representations of seismic fragility. Unlike fragility curves, which are functions of scalar intensity measures, fragility surfaces are characterized by two earthquake-hazard parameters, (m, r). The calculation of fragility surfaces may be computationally expensive for complex systems; as solutions to this issue, a bivariate lognormal parametric model for fragility surfaces and an efficient calculation method based on stochastic reduced-order models are proposed.
Journal Article
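A hedged sketch of a fragility surface over moment magnitude m and source-to-site distance r. The abstract proposes a bivariate lognormal parametric model; the probit-link functional form and coefficients below are illustrative assumptions, not the authors' calibration.

```python
# Parametric fragility surface P(damage | m, r) (illustrative form).
import numpy as np
from scipy.stats import norm

def fragility_surface(m, r, a=(-12.0, 2.0, -1.5), beta=0.6):
    """Probit-form surface: increasing in magnitude m, decreasing in distance r (km)."""
    a0, a1, a2 = a   # assumed coefficients, for illustration only
    return norm.cdf((a0 + a1 * m + a2 * np.log(r)) / beta)

distances = np.array([10.0, 30.0, 100.0])   # km
for m in np.linspace(5.0, 8.0, 4):
    print(f"m={m:.1f}:", np.round(fragility_surface(m, distances), 3))
```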
Dynamic Systems with Poisson White Noise
2004
Methods are developed for finding properties of the output of linear and nonlinear dynamic systems subjected to random actions represented by Poisson white noise and filtered Poisson processes. Poisson white noise can be viewed as a sequence of independent, identically distributed pulses arriving at random times; the filtered Poisson process is the output of a linear filter driven by Poisson white noise. Three methods are considered for finding output properties. If the input has infrequent or frequent pulses, output properties can be obtained from a Markov model or from the assumption that the input is a Gaussian white noise, respectively. Otherwise, a method based on Itô's formula for semimartingales is used to find output properties. Examples are used to illustrate the proposed methods.
Journal Article
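A minimal sketch of the filtered Poisson process described above: pulses of iid random magnitude arrive at Poisson times and are passed through a linear filter. The exponentially decaying impulse response and the rate, horizon, and decay constants are assumed choices for illustration.

```python
# Simulating a filtered Poisson process (sketch).
import numpy as np

rng = np.random.default_rng(3)
rate, horizon, tau = 2.0, 10.0, 0.5   # arrival rate (1/s), window (s), filter decay (s)

n_pulses = rng.poisson(rate * horizon)
arrival_times = rng.uniform(0.0, horizon, n_pulses)   # Poisson arrivals in the window
marks = rng.standard_normal(n_pulses)                 # iid pulse magnitudes

t = np.linspace(0.0, horizon, 1000)
# Filter output: sum of h(t - t_k) * Y_k over past pulses,
# with impulse response h(s) = exp(-s / tau) for s >= 0.
response = np.zeros_like(t)
for tk, y in zip(arrival_times, marks):
    response += y * np.exp(-(t - tk) / tau) * (t >= tk)

print("sample mean and std of output:", response.mean(), response.std())
```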
Extremes of vector-valued processes by finite dimensional models
2023
Finite dimensional (FD) models, i.e., deterministic functions of time/space and finite sets of random variables, are constructed for target vector-valued random processes/fields. They are required to have two properties. First, standard Monte Carlo algorithms can be used to generate their samples, referred to as FD samples. Second, under some conditions specified by several theorems, FD samples can be used to estimate distributions of extremes and other functionals of target random functions. Numerical examples involving two-dimensional random processes and apparent properties of random microstructures demonstrate the implementation of FD models for these stochastic problems and show that the models are accurate if the conditions of the theorems are satisfied.
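A hedged sketch of an FD model: a stationary Gaussian process is approximated by a truncated spectral representation, a standard FD construction (the paper's specific models may differ), and FD samples are used to estimate the distribution of the process maximum. The frequencies and modal variances are assumed for illustration.

```python
# FD model via truncated spectral representation, used to estimate extremes (sketch).
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 500)
n_terms, n_samples = 20, 2000
freqs = 2.0 * np.pi * np.arange(1, n_terms + 1)
sigmas = 1.0 / np.arange(1, n_terms + 1)   # assumed modal standard deviations

# FD model: X(t) = sum_k sigma_k * (A_k cos(w_k t) + B_k sin(w_k t)),
# with A_k, B_k independent standard normal random variables.
A = rng.standard_normal((n_samples, n_terms)) * sigmas
B = rng.standard_normal((n_samples, n_terms)) * sigmas
samples = A @ np.cos(np.outer(freqs, t)) + B @ np.sin(np.outer(freqs, t))

maxima = samples.max(axis=1)                  # FD samples of the process maximum
print("estimated P(max X > 2):", (maxima > 2.0).mean())
```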
Specification of additional information for solving stochastic inverse problems
by Grigoriu, Mircea D; Uy, Wayne Isaac T
in Bayesian analysis; Differential equations; Identification methods
2020
Methods have been developed to identify the probability distribution of a random vector Z from information consisting of its bounded range and the probability density function or moments of a quantity of interest, Q(Z). The mapping from Z to Q(Z) may arise from a stochastic differential equation whose coefficients depend on Z. This problem differs from Bayesian inverse problems, as the latter are primarily driven by observation noise. We motivate this work by demonstrating that additional information on Z is required to recover its true law. Our objective is to identify what additional information on Z is needed and to propose methods to recover the law of Z under such information. These methods employ tools such as Bayes' theorem, the principle of maximum entropy, and forward uncertainty quantification to obtain solutions to the inverse problem that are consistent with the information on Z and Q(Z). The additional information on Z may include its moments or its family of distributions. We justify our objective by considering the ability of solutions to this inverse problem to predict the probability law of unobserved quantities of interest.
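A hedged sketch of the forward-uncertainty-quantification ingredient mentioned above: the law of a bounded scalar Z is parametrized (here by a Beta family, an illustrative assumption), pushed through the map Q, and fitted so that the moments of Q(Z) match given target moments. The map Q and the target values are placeholders, not from the paper.

```python
# Recovering a parametrized law of Z from moments of Q(Z) (sketch).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import beta

rng = np.random.default_rng(5)
Q = lambda z: np.exp(z)                  # assumed forward map Z -> Q(Z)
target_moments = np.array([1.65, 3.0])   # placeholder targets: E[Q], E[Q^2]
u = rng.uniform(size=20_000)             # common random numbers for a smooth objective

def moment_mismatch(params):
    a, b = np.exp(params)                # keep Beta parameters positive
    q = Q(beta.ppf(u, a, b))             # forward UQ: push samples of Z through Q
    sample_moments = np.array([q.mean(), (q**2).mean()])
    return np.sum((sample_moments - target_moments) ** 2)

result = minimize(moment_mismatch, x0=[0.0, 0.0], method="Nelder-Mead")
print("fitted Beta parameters:", np.exp(result.x))
```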
Stochastic Properties of Static Friction
by Albertini, Gabriele; Karrer, Simon; Grigoriu, Mircea D
in Coefficient of friction; Computer simulation; Convolution
2020
The onset of frictional motion is mediated by rupture-like slip fronts, which nucleate locally and eventually propagate along the entire interface, causing global sliding. The static friction coefficient is a macroscopic measure of the applied force at the particular instant when the frictional interface loses stability. However, experimental studies are known to present significant scatter in the measurement of static friction, the origin of which remains unexplained. Here, we study the nucleation of local slip at interfaces with slip-weakening friction of random strength and analyze the resulting variability in the measured global strength. Using numerical simulations that solve the elastodynamic equations, we observe that multiple slip patches nucleate simultaneously, many of which are stable and grow only slowly, but one reaches a critical length and starts propagating dynamically. We show that a theoretical criterion based on a static equilibrium solution predicts quantitatively well the onset of frictional sliding. We develop a Monte Carlo model by adapting the theoretical criterion and pre-computing modal convolution terms, which enables us to run a large number of samples efficiently and to study the variability in the global strength distribution caused by the stochastic properties of local frictional strength. The results demonstrate that an increasing spatial correlation length on the interface, representing geometric imperfections and roughness, causes lower global static friction; conversely, a smaller correlation length increases the macroscopic strength while its variability decreases. We further show that randomness in local friction properties is insufficient for the existence of systematic precursory slip events. Random or systematic non-uniformity in the driving force, such as potential energy or stress drop, is required for arrested slip fronts. Our model and observations...
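A hedged sketch of the stochastic ingredient described above: correlated random fields of local static friction strength with a tunable correlation length, as inputs to a Monte Carlo study of global strength. The nucleation criterion itself is not reproduced; the Gaussian-smoothing field construction, mean, and scatter are standard assumptions rather than the authors' exact model.

```python
# Correlated local friction strength fields for Monte Carlo sampling (sketch).
import numpy as np
from scipy.ndimage import gaussian_filter

def strength_field(n_points, corr_len, mean=0.6, std=0.05, rng=None):
    """1D interface of local static friction coefficients with tunable correlation."""
    rng = rng or np.random.default_rng()
    field = gaussian_filter(rng.standard_normal(n_points), sigma=corr_len, mode="wrap")
    field = (field - field.mean()) / field.std()   # re-standardize after smoothing
    return mean + std * field

rng = np.random.default_rng(6)
for corr_len in (2.0, 8.0, 32.0):
    # Per-sample weakest point, a crude summary of local strength heterogeneity.
    weakest = [strength_field(1024, corr_len, rng=rng).min() for _ in range(200)]
    print(f"corr_len={corr_len:5.1f}  mean weakest-point strength: {np.mean(weakest):.3f}")
```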