Catalogue Search | MBRL
Explore the vast range of titles available.
23,051 result(s) for "Model Calibration"
When are multiobjective calibration trade-offs in hydrologic models meaningful?
by Wagener, T.; Kollat, J. B.; Reed, P. M.
in evolutionary algorithm; hydrologic model calibration; model diagnostics
2012
This paper applies a four‐objective calibration strategy focusing on peak flows, low flows, water balance, and flashiness to 392 Model Parameter Estimation Experiment (MOPEX) watersheds across the United States. Our analysis explores the influence of model structure by analyzing how the multiobjective calibration trade‐offs for two conceptual hydrologic models, the Hydrology Model (HYMOD) and the Hydrologiska Byråns Vattenbalansavdelning (HBV) model, compare for each of the 392 catchments. Our results demonstrate that for modern multiobjective calibration frameworks to identify any meaningful measure of model structural failure, users must be able to carefully control the precision by which they evaluate their trade‐offs. Our study demonstrates that the concept of epsilon‐dominance provides an effective means of attaining bounded and meaningful hydrologic model calibration trade‐offs. When analyzed at an appropriate precision, we found that meaningful multiobjective trade‐offs are far less frequent than prior literature has suggested. However, when trade‐offs do exist at a meaningful precision, they have significant value for supporting hydrologic model selection, distinguishing core model deficiencies, and identifying hydroclimatic regions where hydrologic model prediction is highly challenging. Key Points: (1) prior calibration studies suggest frequent multi‐objective tradeoffs; (2) calibration tradeoffs tend to collapse when considering reasonable precision; (3) meaningful tradeoffs can effectively identify structural deficiencies in models.
Journal Article
Quantifying Groundwater Response and Uncertainty in Beaver‐Influenced Mountainous Floodplains Using Machine Learning‐Based Model Calibration
2025
Beavers (Castor canadensis) alter river corridor hydrology by creating ponds and inundating floodplains, thereby improving surface water storage. However, the impact of inundation on groundwater, particularly in mountainous alluvial floodplains with permeable gravel/cobble layers overlain by a soil layer, remains uncertain. Numerical modeling across various floodplain structures considers topographic and sediment complexity and multidirectional flow, linking inundation to groundwater response. This study develops a model‐data integration workflow to address uncertainty in groundwater response to beaver‐induced inundations in a mountainous alluvial floodplain in the Upper Colorado River Basin. Uncertain factors include seasonal hydrologic dynamics, hydraulic conductivities, floodplain structures, and meteorological forcings. We employed an ensemble of groundwater models, based on geophysical and hydrologic data, with machine learning‐based calibration using a neural density estimator. This allowed us to quantify the vertical flux from the soil layer to the permeable gravel bed, the down‐valley underflow within the gravel bed, and their ratio. Results show a significant increase in the vertical flux relative to down‐valley underflow, from 2% during dry pond periods to 20% during wet periods, serving as an analogy for conditions without and with beaver ponds. The study highlights the influence of floodplain structure on groundwater storage, water balance, and water quality impacted by beaver ponds. A thick gravel bed layer, with a large down‐valley underflow, minimizes the effect of beaver‐induced inundation on water quality. We emphasize the need for field‐scale measurements of floodplain structure and improved characterization of evapotranspiration changes to reduce uncertainty in groundwater response.
Journal Article
An Explainable Machine Learning Approach for COVID-19’s Impact on Mood States of Children and Adolescents during the First Lockdown in Greece
by Dimitrios Priftis; Ioanna Giannopoulou; Aspasia Serdari
in Anxiety; Child psychopathology; Children
2022
The global spread of COVID-19 led the World Health Organization to declare a pandemic on 11 March 2020. To decelerate this spread, countries took strict measures that affected lifestyles and economies. Various studies have focused on the identification of COVID-19’s impact on the mental health of children and adolescents via traditional statistical approaches. However, a machine learning methodology must be developed to explain the main factors that contribute to the changes in the mood state of children and adolescents during the first lockdown. Therefore, in this study an explainable machine learning pipeline is presented, focusing on children and adolescents in Greece, where a strict lockdown was imposed. The target group consists of children and adolescents, recruited from child and adolescent mental health services, who presented mental health problems diagnosed before the pandemic. The proposed methodology comprises: (i) data collection via questionnaires; (ii) a clustering process to identify the groups of subjects with amelioration, deterioration and stability in their mood state; (iii) a feature selection process to identify the most informative features that contribute to mood state prediction; (iv) a decision-making process based on an experimental evaluation among classifiers; (v) calibration of the best-performing model; and (vi) a post hoc interpretation of the features’ impact on the best-performing model. The results showed that a blend of heterogeneous features from almost all feature categories is necessary to increase our understanding regarding the effect of the COVID-19 pandemic on the mood state of children and adolescents.
Journal Article
kuenm: an R package for detailed development of ecological niche models using Maxent
by Peterson, A. Townsend; Osorio-Olvera, Luis; Barve, Narayani
in Automation; Biodiversity; Biogeography
2019
Ecological niche modeling is a set of analytical tools with applications in diverse disciplines, yet creating these models rigorously is now a challenging task. The calibration phase of these models is critical, but despite recent attempts at providing tools for performing this step, adequate detail is still missing. Here, we present the kuenm R package, a new set of tools for performing detailed development of ecological niche models using the platform Maxent in a reproducible way.
This package takes advantage of the versatility of R and Maxent to enable detailed model calibration and selection, final model creation and evaluation, and extrapolation risk analysis. The best parameters for modeling are selected considering (1) statistical significance, (2) predictive power, and (3) model complexity. For final models, we enable multiple parameter sets and model transfers, making processing simpler. Users can also evaluate extrapolation risk in model transfers via the mobility-oriented parity (MOP) metric.
Use of this package allows robust processes of model calibration, facilitating creation of final models based on model significance, performance, and simplicity. Model transfers to multiple scenarios, also facilitated in this package, significantly reduce time invested in performing these tasks. Finally, efficient assessments of strict-extrapolation risks in model transfers via the MOP and MESS metrics help to prevent overinterpretation in model outcomes.
Journal Article
Calibrating non-probability surveys to estimated control totals using LASSO, with an application to political polling
by Elliott, Michael R.; Valliant, Richard L.; Chen, Jack Kuang Tsung
in Access; Adaptive control; Bias
2019
Declining response rates and increasing costs have led to greater use of non-probability samples in election polling. But non-probability samples may suffer from selection bias due to differential access, degrees of interest and other factors. Here we estimate voting preference for 19 elections in the US 2014 midterm elections by using large non-probability surveys obtained from SurveyMonkey users, calibrated to estimated control totals using model-assisted calibration combined with adaptive LASSO regression, or the estimated controlled LASSO, ECLASSO. Comparing the bias and root-mean-square error of ECLASSO with traditional calibration methods shows that ECLASSO can be a powerful method for adjusting non-probability surveys even when only a small sample is available from a probability survey. The methodology proposed has potentially broad application across social science and health research, as response rates for probability samples decline and access to non-probability samples increases.
Journal Article
Flat roof hygrothermal performance testing and evaluation
by Millan, Jose Antonio; Sala, Jose Maria; Flores-Abascal, Ivan
in Building codes; Building components; Buildings
2020
Purpose: Understanding the dynamic hygrothermal behavior of building elements is very important to ensure the optimal performance of buildings. The Laboratory for Quality Control in Buildings of the Basque Government tested a flat roof designed by a construction company that developed a building to be constructed using prefabricated modules. This is a five- to eight-floor building with a ventilated façade and a flat roof covered by gravel, with the possibility of changing it to a green cover. The paper aims to discuss this issue.
Design/methodology/approach: The interest of this research was threefold. The first objective was to accurately test, under real dynamic weather conditions, the roof design in a PASLINK test cell to obtain the U-value and the thermal capacitance of the different roof layers, and of the roof as a whole, through the precise calibration of resistance-capacitance mathematical models of the roof. Based on the parameters and experimental information of these calibrated models, a second goal was to calibrate and validate a Wufi model of the roof.
Findings: This second calibrated model was then used to simulate the dynamic hygrothermal behavior of the roof, obtaining the roof’s hourly thermal demand per square meter for a whole year in different locations considered in the Spanish Building Code. These simulations also permitted the authors to study the risk of condensation and mold growth of the tested component under different climatic conditions.
Originality/value: The successful combination of the PASLINK method to calibrate the Wufi hygrothermal model is the main novelty of this research.
Journal Article
Are general circulation models obsolete?
2022
Traditional general circulation models, or GCMs—that is, three-dimensional dynamical models with unresolved terms represented in equations with tunable parameters—have been a mainstay of climate research for several decades, and some of the pioneering studies have recently been recognized by a Nobel prize in Physics. Yet, there is considerable debate around their continuing role in the future. Frequently mentioned as limitations of GCMs are the structural error and uncertainty across models with different representations of unresolved scales and the fact that the models are tuned to reproduce certain aspects of the observed Earth. We consider these shortcomings in the context of a future generation of models that may address these issues through substantially higher resolution and detail, or through the use of machine learning techniques to match them better to observations, theory, and process models. It is our contention that calibration, far from being a weakness of models, is an essential element in the simulation of complex systems, and contributes to our understanding of their inner workings. Models can be calibrated to reveal both fine-scale detail and the global response to external perturbations. New methods enable us to articulate and improve the connections between the different levels of abstract representation of climate processes, and our understanding resides in an entire hierarchy of models where GCMs will continue to play a central role for the foreseeable future.
Journal Article
Confronting the Challenge of Modeling Cloud and Precipitation Microphysics
by Fridlind, Ann M.; Xue, Lulin; Harrington, Jerry Y.
in Atmosphere; Atmospheric Processes; Atmospheric water
2020
In the atmosphere, microphysics refers to the microscale processes that affect cloud and precipitation particles and is a key linkage among the various components of Earth's atmospheric water and energy cycles. The representation of microphysical processes in models continues to pose a major challenge leading to uncertainty in numerical weather forecasts and climate simulations. In this paper, the problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics. The recently developed Lagrangian particle‐based method is advocated as a way to address several conceptual and practical challenges of representing particle populations using traditional bulk and bin microphysics parameterization schemes. For addressing critical gaps in cloud physics knowledge, sustained investment for observational advances from laboratory experiments, new probe development, and next‐generation instruments in space is needed. Greater emphasis on laboratory work, which has apparently declined over the past several decades relative to other areas of cloud physics research, is argued to be an essential ingredient for improving process‐level understanding. More systematic use of natural cloud and precipitation observations to constrain microphysics schemes is also advocated. Because it is generally difficult to quantify individual microphysical process rates from these observations directly, this presents an inverse problem that can be viewed from the standpoint of Bayesian statistics. Following this idea, a probabilistic framework is proposed that combines elements from statistical and physical modeling. Besides providing rigorous constraint of schemes, there is an added benefit of quantifying uncertainty systematically. 
Finally, a broader hierarchical approach is proposed to accelerate improvements in microphysics schemes, leveraging the advances described in this paper related to process modeling (using Lagrangian particle‐based schemes), laboratory experimentation, cloud and precipitation observations, and statistical methods.
Plain Language Summary: In the atmosphere, microphysics—the small‐scale processes affecting cloud and precipitation particles such as their growth by condensation, evaporation, and melting—is a critical part of Earth's weather and climate. Because it is impossible to simulate every cloud particle individually owing to their sheer number within even a small cloud, atmospheric models have to represent the evolution of particle populations statistically. There are critical gaps in knowledge of the microphysical processes that act on particles, especially for atmospheric ice particles because of the wide variety and intricacy of their shapes. The difficulty of representing cloud and precipitation particle populations and knowledge gaps in cloud processes both introduce important uncertainties into models that translate into uncertainty in weather forecasts and climate simulations, including climate change assessments. We discuss several specific challenges related to these problems. To improve how cloud and precipitation particle populations are represented, we advocate a "particle‐based" approach that addresses several limitations of traditional approaches and has recently gained traction as a tool for cloud modeling. Advances in observations, including laboratory studies, are argued to be essential for addressing gaps in knowledge of microphysical processes. We also advocate using statistical modeling tools to improve how these observations are used to constrain model microphysics.
Finally, we discuss a hierarchical approach that combines the various pieces discussed in this article, providing a possible blueprint for accelerating progress in how microphysics is represented in cloud, weather, and climate models.
Key Points: (1) microphysics is an important component of weather and climate models, but its representation in current models is highly uncertain; (2) two critical challenges are identified: representing cloud and precipitation particle populations and knowledge gaps in cloud physics; (3) a possible blueprint for addressing these challenges is proposed to accelerate progress in improving microphysics schemes.
Journal Article
Prediction and Utilization of Malondialdehyde in Exotic Pine Under Drought Stress Using Near-Infrared Spectroscopy
2021
Drought is a major abiotic stress that adversely affects the growth and productivity of plants. Malondialdehyde (MDA), a substance produced by membrane lipids in response to reactive oxygen species (ROS), can be used as a drought indicator to evaluate the degree of plasma membrane damage and the ability of plants to tolerate drought stress. Still, measuring MDA is usually a labor- and time-consuming task. In this study, near-infrared (NIR) spectroscopy combined with partial least squares (PLS) was used to obtain rapid and high-throughput measurements of MDA, and the application of this technique to plant drought stress experiments was also investigated. Two exotic conifer tree species, namely, slash pine (Pinus elliottii) and loblolly pine (Pinus taeda), were used as plant material exposed to drought stress; different types of spectral preprocessing methods and important feature-selection algorithms were applied to the PLS model to calibrate it and obtain the best MDA-predicting model. The results show that the best PLS model is established via the combined treatment of the detrended variable–significant multivariate correlation algorithm (DET-sMC) with six latent variables (LVs). This model has a respectable predictive capability, with a correlation coefficient (R²) of 0.66, a root mean square error (RMSE) of 2.28%, and a residual prediction deviation (RPD) of 1.51, and it was successfully implemented in drought stress experiments as a reliable and non-destructive method to detect the MDA content in real time.
Journal Article
Review of statistical model calibration and validation—from the perspective of uncertainty structures
by Lee, Guesuk; Kim, Wongon; Youn, Byeng D.
in Calibration; Computational Mathematics and Numerical Analysis; Computer aided engineering
2019
Computer-aided engineering (CAE) is now an essential instrument that aids in engineering decision-making. Statistical model calibration and validation has recently drawn great attention in the engineering community for its applications in practical CAE models. The objective of this paper is to review the state-of-the-art and trends in statistical model calibration and validation, based on the available extensive literature, from the perspective of uncertainty structures. After a brief discussion about uncertainties, this paper examines three problem categories—the forward problem, the inverse problem, and the validation problem—in the context of techniques and applications for statistical model calibration and validation.
Journal Article