Catalogue Search | MBRL
1,840 result(s) for "uncertainty quantification"
Solving Stochastic Inverse Problems for Property–Structure Linkages Using Data-Consistent Inversion and Machine Learning
by Wildey, Tim; Tran, Anh
in Alloys; Aluminum; Augmenting Physics-based Models in ICME with Machine Learning and Uncertainty Quantification
2021
Determining process–structure–property linkages is one of the key objectives in material science, and uncertainty quantification plays a critical role in understanding both process–structure and structure–property linkages. In this work, we seek to learn a distribution of microstructure parameters that are consistent in the sense that the forward propagation of this distribution through a crystal plasticity finite element model matches a target distribution on materials properties. This stochastic inversion formulation infers a distribution of acceptable/consistent microstructures, as opposed to a deterministic solution, which expands the range of feasible designs in a probabilistic manner. To solve this stochastic inverse problem, we employ a recently developed uncertainty quantification framework based on push-forward probability measures, which combines techniques from measure theory and Bayes’ rule to define a unique and numerically stable solution. This approach requires making an initial prediction using an initial guess for the distribution on model inputs and solving a stochastic forward problem. To reduce the computational burden in solving both stochastic forward and stochastic inverse problems, we combine this approach with a machine learning Bayesian regression model based on Gaussian processes and demonstrate the proposed methodology on two representative case studies in structure–property linkages.
Journal Article
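The data-consistent update described in this abstract can be sketched in a few lines: draw samples from an initial guess, push them through a forward map, and reweight so that the push-forward matches a target distribution on the output. The quadratic forward map, uniform initial guess, and Gaussian target below are toy stand-ins for the paper's crystal plasticity model and property distributions, not its actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(lam):
    # toy "structure-property" map standing in for the CPFE model (assumption)
    return lam ** 2

def kde_1d(samples, points, bw):
    # minimal Gaussian kernel density estimate in one dimension
    d = (points[:, None] - samples[None, :]) / bw
    return np.exp(-0.5 * d * d).sum(axis=1) / (len(samples) * bw * np.sqrt(2.0 * np.pi))

# initial guess for the input distribution: uniform on [0, 2]
lam = rng.uniform(0.0, 2.0, 2000)
q = forward(lam)

# target ("observed") density on the output: N(1.0, 0.1)
def pi_obs(qv):
    return np.exp(-0.5 * ((qv - 1.0) / 0.1) ** 2) / (0.1 * np.sqrt(2.0 * np.pi))

# push-forward density of the initial guess, estimated from the samples
pi_pf = kde_1d(q, q, bw=0.05)

# data-consistent update: accept sample i with probability r_i / max(r)
r = pi_obs(q) / pi_pf
accept = rng.uniform(0.0, 1.0, len(lam)) < r / r.max()
lam_post = lam[accept]

# pushing the accepted inputs forward should recover the target distribution
q_post = forward(lam_post)
```

The accepted samples form the "consistent" input distribution: propagating them forward reproduces the target statistics on the output, which is the defining property of the push-forward-based solution.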
On the Brittleness of Bayesian Inference
2015
With the advent of high-performance computing, Bayesian methods are becoming increasingly popular tools for the quantification of uncertainty throughout science and industry. Since these methods can impact the making of sometimes critical decisions in increasingly complicated contexts, the sensitivity of their posterior conclusions with respect to the underlying models and prior beliefs is a pressing question to which there currently exist positive and negative answers. We report new results suggesting that, although Bayesian methods are robust when the number of possible outcomes is finite or when only a finite number of marginals of the data-generating distribution are unknown, they could be generically brittle when applied to continuous systems (and their discretizations) with finite information on the data-generating distribution. If closeness is defined in terms of the total variation (TV) metric or the matching of a finite system of generalized moments, then (1) two practitioners who use arbitrarily close models and observe the same (possibly arbitrarily large amount of) data may reach opposite conclusions; and (2) any given prior and model can be slightly perturbed to achieve any desired posterior conclusion. The mechanism causing brittleness/robustness suggests that learning and robustness are antagonistic requirements, which raises the possibility of a missing stability condition when using Bayesian inference in a continuous world under finite information.
Journal Article
Calibrated explanations for regression
by Löfström, Tuwe; Sönströd, Cecilia; Johansson, Ulf
in Artificial Intelligence; Calibrated explanations; Calibration
2025
Artificial Intelligence (AI) methods are an integral part of modern decision support systems. The best-performing predictive models used in AI-based decision support systems lack transparency. Explainable Artificial Intelligence (XAI) aims to create AI systems that can explain their rationale to human users. Local explanations in XAI can provide information about the causes of individual predictions in terms of feature importance. However, a critical drawback of existing local explanation methods is their inability to quantify the uncertainty associated with a feature’s importance. This paper introduces an extension of a feature importance explanation method, Calibrated Explanations, previously only supporting classification, with support for standard regression and probabilistic regression, i.e., the probability that the target is below an arbitrary threshold. The extension for regression keeps all the benefits of Calibrated Explanations, such as calibration of the prediction from the underlying model with confidence intervals, uncertainty quantification of feature importance, and allows both factual and counterfactual explanations. Calibrated Explanations for regression provides fast, reliable, stable, and robust explanations. Calibrated Explanations for probabilistic regression provides an entirely new way of creating probabilistic explanations from any ordinary regression model, allowing dynamic selection of thresholds. The method is model agnostic with easily understood conditional rules. An implementation in Python is freely available on GitHub and for installation using both pip and conda, making the results in this paper easily replicable.
Journal Article
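The calibration idea this method builds on (an underlying regression model is calibrated so that its intervals carry guaranteed coverage) can be illustrated with a generic split-conformal sketch. This is not the Calibrated Explanations API; it is only the interval-calibration principle, shown with made-up data and a deliberately simple model.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy data: y = 2x + unit Gaussian noise (any regression model could stand in)
x = rng.uniform(0.0, 10.0, 2000)
y = 2.0 * x + rng.normal(0.0, 1.0, 2000)

# fit a deliberately simple model (least squares through the origin) on one half
x_tr, y_tr = x[:1000], y[:1000]
slope = (x_tr * y_tr).sum() / (x_tr * x_tr).sum()

def predict(xv):
    return slope * xv

# calibrate on the held-out half: nonconformity score = absolute residual
x_cal, y_cal = x[1000:], y[1000:]
scores = np.abs(y_cal - predict(x_cal))
q_hat = np.quantile(scores, 0.95)   # 95% calibration quantile

# calibrated 95% interval for a new prediction
x_new = 5.0
lo, hi = predict(x_new) - q_hat, predict(x_new) + q_hat
```

On fresh data drawn from the same distribution, roughly 95% of true targets fall inside the calibrated interval regardless of how crude the underlying model is; thresholding the interval likewise yields calibrated probabilistic statements of the kind the abstract describes.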
Developing hazard scenarios from monitoring data, historical chronicles, and expert elicitation: a case study of Sangay volcano, Ecuador
by Bernard, Benjamin; Hidalgo, Silvana; Tadini, Alessandro
in Archives & records; Earth and Environmental Science; Earth Sciences
2024
Sangay volcano is considered one of the most active volcanoes worldwide. Nevertheless, due to its remote location and low-impact eruptions, its eruptive history and hazard scenarios are poorly constrained. In this work, we address this issue by combining an analysis of monitoring data and historical chronicles with expert elicitation. During the last 400 years, we recognize periods of quiescence, weak, and enhanced eruptive activity, lasting from several months to several years, punctuated by eruptive pulses lasting from a few hours to a few days. Sangay volcano has been mainly active since the seventeenth century, with weak eruptive activity as the most common regime, although there have also been several periods of quiescence. During this period, eruptive pulses with VEI 1–3 occurred mainly during enhanced eruptive activity and produced far-reaching impacts due to ash fallout to the west and long-runout lahars to the south-east. Four eruptive pulse scenarios are considered in the expert elicitation: strong ash venting (SAV, VEI 1–2), violent Strombolian (VS, VEI 2–3), sub-Plinian (SPL, VEI 3–4), and Plinian (PL, VEI 4–5). SAV is identified as the most likely scenario, while PL has the smallest probability of occurrence. The elicitation results show high uncertainty about the probability of occurrence of VS and SPL. Large uncertainties are also observed for eruption duration and bulk fallout volume for all eruptive scenarios, while average column height is better characterized, particularly for SAV and VS. We interpret these results as a consequence of the lack of volcano-physical data, which could be reduced with further field studies. This study shows how historical reconstruction and expert elicitation can help to develop hazard scenarios with uncertainty assessment for poorly known volcanoes, representing a first step towards the elaboration of appropriate hazard maps and subsequent planning.
Journal Article
Uncertainty Quantification of Material Properties in Ballistic Impact of Magnesium Alloys
2022
The design and development of cutting-edge light materials for extreme conditions, including high-speed impact, remains a continuing and significant challenge in spite of steady advances. Magnesium (Mg) and its alloys have gained much attention due to their high strength-to-weight ratio and the potential for further improvements in material properties such as strength and ductility. In this paper, a recently developed computational framework is adopted to quantify the effects of material uncertainties on the ballistic performance of Mg alloys. The framework is able to determine the largest deviation in the performance measure resulting from a finite variation in the corresponding material properties. It can also provide rigorous upper bounds on the probability of failure using known information about the uncertainties and the system, so that conservative safety design and certification can be achieved. This work focuses specifically on AZ31B Mg alloys; it is assumed that the material is well characterized by the Johnson–Cook constitutive and failure models, but that the model parameters are uncertain. The ordering of uncertainty contributions for the model parameters, and the corresponding behavior regimes where those parameters play a crucial role, are determined. Finally, it is shown how this ordering provides insight into the improvement of ballistic performance and the development of new material models for Mg alloys.
Journal Article
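Rigorous upper bounds on the probability of failure of the kind this abstract mentions are, in frameworks of this family, typically concentration-of-measure bounds such as McDiarmid's inequality. A sketch with purely illustrative numbers follows; the margin and the per-parameter output swings are invented, not taken from the paper.

```python
import math

# hypothetical inputs: mean performance margin and per-parameter "diameters",
# i.e. the largest output swing when one Johnson-Cook parameter sweeps its
# uncertainty range while the others are held fixed (all values invented)
margin = 120.0                    # mean margin below the design limit
diameters = [60.0, 35.0, 20.0]    # swings for three uncertain parameters

# McDiarmid concentration-of-measure upper bound on the probability of failure:
# P[failure] <= exp(-2 * margin^2 / sum(D_i^2))
pof_bound = math.exp(-2.0 * margin ** 2 / sum(d * d for d in diameters))

# sorting the diameters also gives an ordering of uncertainty contributions,
# echoing the parameter ranking the abstract describes
ranking = sorted(range(len(diameters)), key=lambda i: -diameters[i])
```

Because the bound depends only on the margin and the diameters, shrinking the largest diameter (better characterization of the most influential parameter) tightens the certificate fastest, which is the design insight such rankings provide.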
Two sources of uncertainty in estimating tephra volumes from isopachs: perspectives and quantification
by Yang, Qingyuan; Jenkins, Susanna F.
in Earth and Environmental Science; Earth Sciences; Geology
2023
Calculating the tephra volume is important for estimating eruption intensity and magnitude. Traditionally, tephra volumes are estimated by integrating the area under curves fit to the square root of isopach areas. In this work, we study two sources of uncertainty in estimating tephra volumes based on isopachs. The first is model uncertainty. It occurs because no fitted curve perfectly describes the tephra thinning pattern, and the fitting is done based on the log-transformed square root of isopach areas. The second source of uncertainty occurs because thickness must be extrapolated beyond the available data, which makes it impossible to validate the extrapolated thickness. We demonstrate the importance of the two sources of uncertainty on a theoretical level. We use six isopach datasets with different characteristics to demonstrate their presence and the effect they could have on volume estimation. Measures to better represent the uncertainty are proposed and tested. For the model uncertainty, we propose (i) a better-informed and stricter way to report and evaluate goodness-of-fit, and (ii) that uncertainty estimations be based on the envelope defined by different well-fitted curves, rather than volumes estimated from individual curves. For the second source of uncertainty, we support reporting separately the volume portions that are interpolated and extrapolated, and we propose to test how sensitive the total volume is to variability in the extrapolated volume. The two sources of uncertainty should not be ignored as they could introduce additional bias and uncertainty in the volume estimate.
Journal Article
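The classical single-segment exponential-thinning fit underlying this kind of volume estimate can be sketched as follows. The coefficients are synthetic; the closed-form volume V = 2*T0/k^2 follows from integrating T = T0*exp(-k*sqrt(A)) over area, and the last lines echo the abstract's recommendation to report the extrapolated portion separately.

```python
import numpy as np

# synthetic isopachs obeying a single-segment exponential thinning law
# T = T0 * exp(-k * sqrt(A))  (thickness in m, sqrt(area) in km; values invented)
T0_true, k_true = 1.0, 2.0
sqrtA = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
T = T0_true * np.exp(-k_true * sqrtA)

# fit a straight line to ln(T) versus sqrt(A), as done in practice
slope, intercept = np.polyfit(sqrtA, np.log(T), 1)
T0, k = np.exp(intercept), -slope

# integrating T dA with A = s**2 gives the closed-form volume V = 2*T0/k**2
volume = 2.0 * T0 / k ** 2                    # units: m * km^2

# extrapolated portion beyond the most distal isopach (s > s_max),
# reported separately from the interpolated portion
s_max = sqrtA[-1]
v_extrap = 2.0 * T0 * np.exp(-k * s_max) * (s_max / k + 1.0 / k ** 2)
frac_extrap = v_extrap / volume
```

With real data the fit is not exact, and the spread of volumes across different well-fitted curves (exponential, power-law, Weibull) defines the envelope the authors propose for representing model uncertainty.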
Metamodeling on uncertainty quantification in the behavior of the tire/road interaction of vehicles
by Santos, Vinicius Ramos Israel; Santos, Fabio Lúcio; Scinocca, Francisco
in Ambient temperature; Coefficient of friction; Lateral forces
2024
In recent years, the automotive industry has been developing applied research to meet customers' needs, considering safety, vehicle comfort and energy efficiency. In particular, automotive tires have a prominent position in this research area, ensuring good vehicle handling, comfort and safety. In vehicle dynamics performance, excellent grip and reduced rolling resistance in the tires are crucial to maximize energy efficiency in a reliable way. However, to ensure such a level of reliability in vehicle operation, the inherent uncertainties of tires, as well as other factors subject to variability, must be taken into account in the vehicle design. In this way, the present paper analyzes in detail the effect of variations in parameters such as ambient temperature, ground conditions, vertical load, speed and tire inflation on the analysis of vehicle dynamics. A metamodeling approach associated with Monte Carlo simulation was employed to develop the mathematical models used to analyze the effect of uncertain parameters on the tire rolling resistance and on the traction, centripetal and lateral forces, using experimental data from the literature, in the longitudinal and lateral vehicle dynamics. The present research therefore brings as an innovation an integrated approach linking the input parameters of the system to the rolling resistance through the developed metamodels. There was substantial variability of up to 15% both upward and downward in the maximum traction force of a vehicle in response to variations in the vehicle's weight and the coefficient of tire rolling resistance. In contrast, the lateral force exhibited greater variability, with a 25% downward and 10% upward variation associated with the variability of the vehicle's weight and friction coefficient. Further investigation of the sensitivity analysis highlights the significant influence of the friction coefficient and temperature on the traction forces of the vehicle.
Journal Article
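The metamodeling-plus-Monte-Carlo workflow the abstract describes can be sketched generically: fit a cheap surrogate to a handful of model runs, then propagate input variability through the surrogate instead of the expensive model. The rolling-resistance function and all parameter ranges below are illustrative stand-ins, not the paper's experimental data.

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical rolling-resistance model standing in for experimental data:
# F = weight * Crr * (1 + 0.01 * speed)  -- illustrative, not the paper's model
def rolling_resistance(weight, crr, speed):
    return weight * crr * (1.0 + 0.01 * speed)

# small "design of experiments" to train the metamodel
W = rng.uniform(9000.0, 15000.0, 200)    # vehicle weight (N)
C = rng.uniform(0.008, 0.014, 200)       # rolling-resistance coefficient (-)
V = rng.uniform(5.0, 30.0, 200)          # speed (m/s)
F = rolling_resistance(W, C, V)

# linear-in-parameters polynomial surrogate: F ~ a0 + a1*W + a2*C + a3*V + a4*W*C
X = np.column_stack([np.ones_like(W), W, C, V, W * C])
coef, *_ = np.linalg.lstsq(X, F, rcond=None)

# Monte Carlo on the cheap surrogate: propagate input variability
Wm = rng.normal(12000.0, 600.0, 50000)
Cm = rng.normal(0.011, 0.001, 50000)
Vm = rng.normal(20.0, 3.0, 50000)
Fm = np.column_stack([np.ones_like(Wm), Wm, Cm, Vm, Wm * Cm]) @ coef

# relative width of the central 95% band of the predicted force
spread = (np.percentile(Fm, 97.5) - np.percentile(Fm, 2.5)) / Fm.mean()
```

The surrogate makes the 50,000-sample Monte Carlo essentially free, which is the point of the metamodeling step; sensitivity rankings like those in the paper then come from varying one input distribution at a time.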
Bayesian Parameter Determination of a CT-Test Described by a Viscoplastic-Damage Model Considering the Model Error
by Matthies, Hermann G.; Dinkler, Dieter; Adeli, Ehsan
in Bayesian parameter and damage identification; functional approximation; health monitoring
2020
The state of materials, and accordingly the properties of structures, change over the period of use, which may influence the reliability and quality of the structure during its lifetime. Identification of the model parameters of the system is therefore a topic that has attracted attention in the context of structural health monitoring. The parameters of a constitutive model are usually identified by minimization of the difference between model response and experimental data. However, measurement errors and differences between specimens lead to deviations in the determined parameters. In this article, the Chaboche model with damage is used, and a stochastic simulation technique is applied to generate artificial data which exhibit the same stochastic behavior as experimental data. The model and damage parameters are then identified by applying the sequential Gauss-Markov-Kalman filter (SGMKF) approach, as this method has proven the most efficient among filtering and random-walk approaches for time-consuming finite element model updating problems. The parameters identified using this Bayesian approach are compared with the true parameters in the simulation, and the efficiency of the identification method is discussed. The aim of this study is to observe whether the method is suitable and efficient for identifying the model and damage parameters of a highly non-linear material model for a real structural specimen, using a limited surface displacement measurement vector obtained by Digital Image Correlation (DIC); to see how much information is needed to estimate the parameters accurately even when the model error is considered; and to assess whether this approach can also be used in practice for health monitoring purposes before the occurrence of severe damage and collapse.
Journal Article
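A minimal sample-based Gauss-Markov-Kalman update (the linear filter at the core of SGMKF) can be sketched on a toy linear model. The paper's actual forward model is a nonlinear viscoplastic-damage finite element simulation, so everything below is a simplified stand-in.

```python
import numpy as np

rng = np.random.default_rng(3)

# toy linear "material response" h(theta) = H @ theta; assumption standing in
# for the nonlinear CT-test finite element model
H = np.array([[1.0, 0.5],
              [0.2, 1.5]])
theta_true = np.array([2.0, -1.0])
noise_std = 0.05
y_obs = H @ theta_true + rng.normal(0.0, noise_std, 2)

# prior ensemble of parameter samples
n = 5000
theta = rng.normal(0.0, 2.0, size=(n, 2))

# sample-based Gauss-Markov-Kalman update: theta_a = theta_f + K (y_obs - y_f)
y_pred = theta @ H.T + rng.normal(0.0, noise_std, size=(n, 2))
cov_all = np.cov(theta.T, y_pred.T)                    # joint 4x4 sample covariance
K = cov_all[:2, 2:] @ np.linalg.inv(cov_all[2:, 2:])   # Kalman gain
theta = theta + (y_obs - y_pred) @ K.T                 # assimilate the measurement

posterior_mean = theta.mean(axis=0)
```

In the sequential variant, this update is applied repeatedly as measurements arrive, and the ensemble is replaced or supplemented by a functional (e.g. polynomial chaos) approximation to keep the cost of the expensive forward model manageable.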
A Nonlinear Approach in the Quantification of Numerical Uncertainty by High-Order Methods for Compressible Turbulence with Shocks
2024
This is a comprehensive overview of our research work linking interdisciplinary modeling and simulation techniques to improve the predictability and reliability of simulations (PARs) of compressible turbulence with shock waves, written for general audiences who are not familiar with our nonlinear approach. This focused nonlinear approach integrates our “nonlinear dynamical approach” with our “newly developed high-order entropy-conserving, momentum-conserving and kinetic-energy-preserving methods” in the quantification of numerical uncertainty in highly nonlinear flow simulations. The central issue is that the solution space of discrete genuinely nonlinear systems is much larger than that of the corresponding genuinely nonlinear continuous systems, so the numerical solutions obtained might not be solutions of the continuous systems. Traditional uncertainty quantification (UQ) approaches in numerical simulations commonly employ linearized analysis that might not capture the true behavior of genuinely nonlinear physical fluid flows. Due to the rapid development of high-performance computing, the last two decades have been an era in which computation is ahead of analysis and in which very large-scale practical computations are increasingly used in poorly understood, multiscale, data-limited complex nonlinear physical problems and non-traditional fields. This is compounded by the fact that the numerical schemes used in production computational fluid dynamics (CFD) codes often do not take into consideration the genuinely nonlinear behavior of numerical methods for more realistic modeling and simulation. Often, the numerical methods used might have been developed for weakly nonlinear flow or for flow types other than the flow being investigated. In addition, some of these methods are not discretely physics-preserving (structure-preserving); this includes, but is not limited to, entropy-conserving, momentum-conserving and kinetic-energy-preserving methods.
Employing theories of nonlinear dynamics to guide the construction of more appropriate, stable and accurate numerical methods could help, e.g., (a) delineate solutions of the discretized counterparts that are not solutions of the governing equations; (b) prevent numerical chaos or numerical “turbulence” leading to false prediction of transition to turbulence; (c) provide more reliable numerical simulations of nonlinear fluid dynamical systems, especially by direct numerical simulation (DNS), large eddy simulation (LES) and implicit large eddy simulation (ILES); and (d) prevent incorrect computed shock speeds for problems containing stiff nonlinear source terms, if present. For computation-intensive turbulent flows, the desirable methods should also be efficient and exhibit scalable parallelism on current high-performance computing platforms. Selected numerical examples are included to illustrate the genuinely nonlinear behavior of numerical methods and our integrated approach to improve PARs.
Journal Article
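A concrete example of the structure-preserving fluxes this article advocates is the entropy-conservative two-point flux for Burgers' equation, f* = (uL² + uL·uR + uR²)/6. A minimal sketch follows, with grid and time-step parameters chosen only for illustration.

```python
import numpy as np

# conservative finite-volume steps for Burgers' equation u_t + (u^2/2)_x = 0
# on a periodic domain, using the entropy-conservative two-point flux
n = 200
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = 1.0 + 0.5 * np.sin(x)          # smooth periodic initial data
dx, dt = x[1] - x[0], 1.0e-3

def ec_flux(uL, uR):
    # satisfies the entropy condition (uR - uL) * f = (uR**3 - uL**3) / 6 exactly
    return (uL * uL + uL * uR + uR * uR) / 6.0

for _ in range(100):
    f = ec_flux(u, np.roll(u, -1))            # interface fluxes f_{i+1/2}
    u = u - dt / dx * (f - np.roll(f, 1))     # conservative (telescoping) update

momentum = u.mean()                 # conserved to round-off by construction
entropy = 0.5 * (u * u).mean()      # drifts only at O(dt^2) per forward-Euler step
```

The flux-difference form conserves the discrete momentum exactly (the sum telescopes), and the semi-discrete scheme conserves the discrete entropy u²/2; the small residual drift here comes solely from the forward-Euler time step, illustrating the distinction between spatially structure-preserving discretizations and fully discrete ones.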
Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock
2016
Many engineering applications in fractured crystalline rocks use measured orientations of structures, such as rock contacts and fractures, and of lineated objects, such as foliation and rock stress, mapped in boreholes as their foundation. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and their effects on the inferred orientations have been reported. Relying only on the specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainty, develops inference models of their magnitudes, and points out possible implications for the inference on orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock, with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented here hence relies on orientations of fractures; however, the techniques to infer the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and their magnitudes correctly inferred.
The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to infer a correct orientation model and the parameters coupled to the fracture sets; (2) it is important to perform multiple measurements so as to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most important measure to decrease the uncertainty space is to avoid drilling steeper than about −80°.
Journal Article
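A Monte Carlo propagation of tool imprecision onto a measured fracture orientation, in the spirit of the recommendation to infer actual rather than theoretical uncertainty, can be sketched as follows. The rotation convention and the sigma values are illustrative assumptions, not the instruments or conventions used in the SKB campaigns.

```python
import numpy as np

rng = np.random.default_rng(4)
deg = np.pi / 180.0

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def pole_global(trend, plunge, roll, alpha, beta):
    # fracture pole measured relative to the core axis (alpha, beta), rotated
    # into global coordinates via borehole roll, plunge and trend (one of
    # several possible conventions; assumed here for illustration)
    pole_core = np.array([np.sin(alpha) * np.cos(beta),
                          np.sin(alpha) * np.sin(beta),
                          np.cos(alpha)])
    return rot_z(trend) @ rot_y(plunge) @ rot_z(roll) @ pole_core

# reference orientation and illustrative 1-sigma tool imprecisions
trend0, plunge0, roll0 = 45 * deg, 60 * deg, 30 * deg
alpha, beta = 40 * deg, 120 * deg
sig_trend, sig_plunge, sig_roll = 1.0 * deg, 0.5 * deg, 2.0 * deg

v0 = pole_global(trend0, plunge0, roll0, alpha, beta)
devs = []
for _ in range(2000):
    v = pole_global(trend0 + rng.normal(0.0, sig_trend),
                    plunge0 + rng.normal(0.0, sig_plunge),
                    roll0 + rng.normal(0.0, sig_roll),
                    alpha, beta)
    devs.append(float(np.arccos(np.clip(v @ v0, -1.0, 1.0))))

dev95 = float(np.degrees(np.percentile(devs, 95)))   # 95th-percentile angular error
```

Repeating the simulation for different borehole plunges (or with empirically measured rather than manufacturer-specified sigmas) shows how the angular uncertainty of the inferred pole varies with drilling geometry, which is the kind of analysis the paper's nine uncertainty sources feed into.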