Catalogue Search | MBRL
Explore the vast range of titles available.
2,393 result(s) for "Calibration and validation"
Latest Progress of the Chinese Meteorological Satellite Program and Core Data Processing Technologies
2019
In this paper, the latest progress, major achievements, and future plans of the Chinese meteorological satellites and their core data processing techniques are discussed. First, the latest three FengYun (FY) meteorological satellites (FY-2H, FY-3D, and FY-4A) and their primary objectives are introduced. Second, the core image navigation techniques and accuracies of the FY meteorological satellites are elaborated, covering the latest geostationary (FY-2/4) and polar-orbiting (FY-3) satellites. Third, the radiometric calibration techniques and accuracies of the reflective solar bands, thermal infrared bands, and passive microwave bands are discussed, along with the latest progress in real-time calibration with the onboard calibration system and in validation with different methods, including vicarious calibration at the China radiance calibration site, pseudo-invariant calibration site calibration, deep convective cloud calibration, and lunar calibration. Fourth, recent progress in meteorological satellite data assimilation applications and quantitative science products is summarized at length; the main progress is in assimilating microwave and hyperspectral infrared sensor data into global and regional numerical weather prediction models. Lastly, the latest progress in radiative transfer, absorption, and scattering calculations for satellite remote sensing is summarized, and some important research using a new radiative transfer model is illustrated.
Journal Article
Realistic Forest Stand Reconstruction from Terrestrial LiDAR for Radiative Transfer Modelling
2018
Forest biophysical variables derived from remote sensing observations are vital for climate research. The combination of structurally and radiometrically accurate 3D “virtual” forests with radiative transfer (RT) models creates a powerful tool to facilitate the calibration and validation of remote sensing data and derived biophysical products by helping us understand the assumptions made in data processing algorithms. We present a workflow that uses highly detailed 3D terrestrial laser scanning (TLS) data to generate virtual forests for RT model simulations. Our approach to forest stand reconstruction from a co-registered point cloud is unique as it models each tree individually. Our approach follows three steps: (1) tree segmentation; (2) tree structure modelling and (3) leaf addition. To demonstrate this approach, we present the measurement and construction of a one hectare model of the deciduous forest in Wytham Woods (Oxford, UK). The model contains 559 individual trees. We matched the TLS data with traditional census data to determine the species of each individual tree and allocate species-specific radiometric properties. Our modelling framework is generic, highly transferable and adjustable to data collected with other TLS instruments and different ecosystems. The Wytham Woods virtual forest is made publicly available through an online repository.
Journal Article
Efficient Urban Runoff Quantity and Quality Modelling Using SWMM Model and Field Data in an Urban Watershed of Tehran Metropolis
by Sañudo-Fontaneda, Luis Angel; Zakizadeh, Fariba; Salajegheh, Ali
in Calibration; Kinematics; Literature reviews
2022
This study aims to calibrate and validate the EPA Storm Water Management Model (SWMM) from field measurements of rainfall and runoff, in order to simulate the rainfall-runoff process in an urban watershed of the Tehran metropolis, Iran. During and after three significant storm events, flow rates and concentrations of total suspended solids (TSS), total phosphorus (TP), and total Kjeldahl nitrogen (TKN) were measured at the outlet of the catchment and used in the model calibration and validation process. The performance of the SWMM model was evaluated based on statistical criteria as well as graphical techniques. A local sensitivity analysis was carried out to identify the key model parameters, showing that the percentage of impervious surface in each subwatershed had the greatest effect on the model output. Based on the analysis of the results, the SWMM calibration and validation can be judged satisfactory, and the goodness-of-fit indices for simulating runoff quality and quantity fall within acceptable ranges. The agreement obtained between the measured and simulated flow rates, pollutograph concentrations, total pollutant loads, peak concentrations, and event mean concentrations (EMC) confirms the considerable predictive capability of the SWMM model when it is well calibrated using field measurements.
Journal Article
A Novel Time Domain Reflectometry (TDR) System for Water Content Estimation in Soils: Development and Application
by Hassan, Shawcat Basel Mostafa; Coppola, Antonio; Comegna, Alessandro
in Accuracy; Dielectric properties; Electric fields
2025
Nowadays, there is a particular need to estimate soil water content accurately over space and time scales in various applications. For example, precision agriculture, as well as the fields of geology, ecology, and hydrology, necessitates rapid, onsite water content measurements. The time domain reflectometry (TDR) technique is a geophysical method that allows, in a time-varying electric field, the determination of the dielectric permittivity and electrical conductivity of a wide class of porous materials. Measuring volumetric water content in soils is the most frequent application of TDR in soil science and soil hydrology. TDR has grown in popularity over the last 40 years because it is a practical and non-destructive technique that provides both laboratory- and field-scale measurements. However, a significant limitation of this technique is the relatively high cost of TDR devices, despite the availability of a range of commercial systems at varying prices. This paper aimed to design and implement a low-cost, compact TDR device tailored for classical hydrological applications. A series of laboratory experiments was carried out on soils of different textures to calibrate and validate the proposed measuring system. The results show that the device can be used to obtain predictions for monitoring soil water status with acceptable accuracy (R² = 0.95).
Journal Article
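The abstract above describes converting a TDR waveform into volumetric water content. A minimal sketch of that conversion, assuming the widely cited empirical Topp et al. (1980) calibration rather than the paper's own (the probe length and travel time below are illustrative values, not taken from the study):

```python
# Sketch: estimating volumetric soil water content from a TDR measurement.
# Assumes the empirical Topp et al. (1980) calibration; the paper develops
# its own calibration, so all numbers here are purely illustrative.

C = 299_792_458.0  # speed of light in vacuum (m/s)

def apparent_permittivity(travel_time_s: float, probe_length_m: float) -> float:
    """Apparent dielectric constant Ka from the two-way travel time along the probe."""
    return (C * travel_time_s / (2.0 * probe_length_m)) ** 2

def topp_water_content(ka: float) -> float:
    """Topp et al. (1980) polynomial: volumetric water content (m3/m3) from Ka."""
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

# Illustrative measurement: 0.15 m probe, 2.5 ns two-way travel time.
ka = apparent_permittivity(2.5e-9, 0.15)
theta = topp_water_content(ka)
print(f"Ka = {ka:.1f}, theta = {theta:.3f} m3/m3")
```

In practice the paper's device would report its own waveform timings, and a soil-specific calibration (the study's laboratory experiments) would replace the generic Topp polynomial.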
Assessing the impact of climate change on water requirement and yield of sugarcane over different agro-climatic zones of Tamil Nadu
2024
The DSSAT CANEGRO model was calibrated and verified using field experimental data from five Tamil Nadu agro-climatic zones (1981–2022). The genetic coefficients of the sugarcane cultivar (CO-86032) were calculated. R² between measured and simulated stalk fresh mass was 0.9, with nRMSE of 0.01 and RMSE of 1.6; R² between measured and simulated sucrose mass was 0.9, with nRMSE of 0.16 and RMSE of 1.2; and R² between measured and simulated yield was 0.9, with nRMSE of 0.01 and RMSE of 1.6. As a result, the CANEGRO model can be used to simulate the phenology and yield features of the sugarcane cultivar in Tamil Nadu's agro-climatic zones. Temperature increases in the agro-climatic zones resulted in varying yield reductions, with a 2 °C increase causing a 3% loss, a 3 °C increase a 5% loss, and a 4 °C increase a 9% loss. The water requirement (WR) rose across all of the ACZs due to the higher temperatures, but to differing degrees: a 2 °C increase typically produced an average 4% increase in the WR, a 3 °C rise increased the WR by 9%, and the WR rose by 13% when the temperature was raised by 4 °C.
Journal Article
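The R², RMSE, and nRMSE values quoted in this abstract are standard model-evaluation statistics. A minimal sketch of how they are computed from paired measured and simulated values (the sample data below are invented for illustration, not taken from the study):

```python
# Sketch: R^2, RMSE, and nRMSE between measured and simulated values,
# as commonly used to evaluate crop models such as DSSAT CANEGRO.
# The data below are invented for illustration only.
import math

def r_squared(measured, simulated):
    """Coefficient of determination of simulated vs. measured values."""
    mean_m = sum(measured) / len(measured)
    ss_res = sum((m - s) ** 2 for m, s in zip(measured, simulated))
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot

def rmse(measured, simulated):
    """Root mean square error, in the units of the variable."""
    n = len(measured)
    return math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)

def nrmse(measured, simulated):
    """RMSE normalised by the mean of the measured values."""
    return rmse(measured, simulated) / (sum(measured) / len(measured))

measured  = [98.0, 105.0, 112.0, 120.0, 131.0]   # e.g. stalk fresh mass, t/ha
simulated = [99.5, 103.0, 113.5, 121.0, 129.0]
print(r_squared(measured, simulated), rmse(measured, simulated), nrmse(measured, simulated))
```

An nRMSE near 0 and an R² near 1 indicate close agreement, which is the basis for the abstract's judgment that the calibrated model is fit for scenario analysis.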
Experimental Calibration and Validation of a Simulation Model for Fault Detection of HVAC Systems and Application to a Case Study
by Filomena, Vincenzo; Rosato, Antonio; Guarino, Francesco
in air-handling units; fault detection and diagnosis; HVAC
2020
Automated fault detection and diagnostics (FDD) could provide a cornerstone for predictive maintenance of heating, ventilation, and air-conditioning (HVAC) systems, based on the development of simulation models able to accurately compare faulty operation against nominal conditions. In this paper, several experiments were carried out to assess the performance of an HVAC unit (nominal cooling/heating capacity of 5.0/5.0 kW) controlling the thermo-hygrometric comfort inside a 4.0 × 4.0 × 3.6 m test room at the Department of Architecture and Industrial Design of the University of Campania Luigi Vanvitelli (Italy); then, a detailed dynamic simulation model was developed and validated by contrasting its predictions with the measured data. The model was also used to analyze the dynamic variations of key parameters associated with faulty operation in comparison to normal performance, in order to identify simplified rules for detecting non-optimal states of HVAC devices. Finally, the simulated performance of the HVAC unit was also investigated while serving a typical Italian office building, with and without the occurrence of typical faults, with the main aim of assessing the impact of the faults on thermo-hygrometric comfort conditions as well as on electric energy consumption.
Journal Article
On the Design of Radar Corner Reflectors for Deformation Monitoring in Multi-Frequency InSAR
2017
Trihedral corner reflectors are increasingly used as point targets in deformation monitoring studies using interferometric synthetic aperture radar (InSAR) techniques. The frequency and size dependence of the corner reflector Radar Cross Section (RCS) means that no single design can perform equally in all the possible imaging modes and radar frequencies available on the currently orbiting Synthetic Aperture Radar (SAR) satellites. Therefore, either a corner reflector design tailored to a specific data type or a compromise design for multiple data types is required. In this paper, I outline the practical and theoretical considerations to be made when designing appropriate radar targets, with a focus on supporting multi-frequency SAR data. These considerations are tested by performing field experiments on targets of different sizes using SAR images from TerraSAR-X, COSMO-SkyMed, and RADARSAT-2. Phase noise behaviour in SAR images can be estimated by measuring the Signal-to-Clutter Ratio (SCR) in individual SAR images. The measured SCR of a point target depends on its RCS performance and on the influence of clutter near the deployed target. The SCR is used as a metric to estimate the expected InSAR displacement error incurred by the design of each target and to validate these observations against theoretical expectations. I find that triangular trihedral corner reflectors as small as 1 m in dimension can achieve a displacement error magnitude of a tenth of a millimetre or less in medium-resolution X-band data. Much larger corner reflectors (2.5 m or greater) are required to achieve the same displacement error magnitude in medium-resolution C-band data. Compromise designs should aim to satisfy the requirements of the lowest SAR frequency to be used, provided that these targets do not saturate the sensor at the highest frequency to be used. Finally, accurate boresight alignment of the corner reflector can be critical to overall target performance. Alignment accuracies better than 4° in azimuth and elevation will have minimal impact on the displacement error in X- and C-band data.
Journal Article
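The link this abstract draws between reflector size, SCR, and displacement error can be sketched with two standard relations: the peak RCS of a triangular trihedral of inner-leg length a, sigma = 4*pi*a^4 / (3*lambda^2), and the high-SCR phase-noise approximation sigma_phi = 1/sqrt(2*SCR), converted to line-of-sight displacement by lambda/(4*pi). The SCR value used below is an illustrative assumption, not a figure from the paper:

```python
# Sketch: relating corner-reflector size and signal-to-clutter ratio (SCR)
# to the expected InSAR displacement error. Standard textbook relations;
# the assumed clutter level (25 dB SCR) is illustrative only.
import math

def trihedral_peak_rcs(leg_m: float, wavelength_m: float) -> float:
    """Peak RCS (m^2) of a triangular trihedral corner reflector with inner leg a."""
    return 4.0 * math.pi * leg_m**4 / (3.0 * wavelength_m**2)

def displacement_error(scr_linear: float, wavelength_m: float) -> float:
    """High-SCR approximation: phase std 1/sqrt(2*SCR), scaled by lambda/(4*pi)."""
    sigma_phi = 1.0 / math.sqrt(2.0 * scr_linear)
    return wavelength_m / (4.0 * math.pi) * sigma_phi

wavelength_x = 0.031                         # X-band wavelength, ~3.1 cm
rcs = trihedral_peak_rcs(1.0, wavelength_x)  # 1 m reflector, as in the abstract
rcs_db = 10.0 * math.log10(rcs)
scr_db = 25.0                                # assumed signal-to-clutter ratio
err_m = displacement_error(10 ** (scr_db / 10.0), wavelength_x)
print(f"RCS = {rcs_db:.1f} dBsm, displacement error = {err_m * 1000:.2f} mm")
```

Under this assumed 25 dB SCR, a 1 m triangular trihedral at X-band yields a displacement error on the order of a tenth of a millimetre, consistent with the abstract's headline finding.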
Toward Predictive Multiscale Modeling of Vascular Tumor Growth
by Rahman, Mohammad M.; Oden, J. Tinsley; Almeida, Regina C.
in Algorithms; Angiogenesis; Big Data
2016
New directions in medical and biomedical sciences have gradually emerged over recent years that will change the way diseases are diagnosed and treated, and are leading to the redirection of medicine toward patient-specific treatments. We refer to these new approaches for studying biomedical systems as predictive medicine, a new version of medical science that involves the use of advanced computer models of biomedical phenomena, high-performance computing, new experimental methods for model data calibration, modern imaging technologies, cutting-edge numerical algorithms for treating large stochastic systems, modern methods for model selection, calibration, validation, verification, and uncertainty quantification, and new approaches for drug design and delivery, all based on predictive models. The methodologies are designed to study events at multiple scales, from genetic data, to sub-cellular signaling mechanisms, to cell interactions, to tissue physics and chemistry, to organs in living human subjects. The present document surveys work on the development and implementation of predictive models of vascular tumor growth, covering aspects of what might be called modeling-and-experimentally based computational oncology. The work described is that of a multi-institutional team, centered at ICES with strong participation by members at M. D. Anderson Cancer Center and the University of Texas at San Antonio. This exposition covers topics on signaling models, cell and cell-interaction models, tissue models based on multi-species mixture theories, models of angiogenesis, and beginning work on drug effects. A number of new parallel computer codes are presented for implementing finite-element methods, multi-level Markov Chain Monte Carlo sampling methods, data classification methods, stochastic PDE solvers, statistical inverse algorithms for model calibration and validation, and models of events at different spatial and temporal scales. Importantly, new methods for model selection in the presence of uncertainty, fundamental to predictive medical science, are described; they are based on the notion of Bayesian model plausibilities. Also, as part of this general approach, new codes for determining the sensitivity of model outputs to variations in model parameters are described, providing a basis for assessing the importance of model parameters and for controlling and reducing the number of relevant model parameters. Model-specific data are to be made accessible through dedicated platforms in the Tumor Engineering Laboratory. We describe the parallel computer platforms on which large-scale calculations are run, as well as the specific time-marching algorithms needed to treat stiff systems encountered in some phase-field mixture models. We also cover new non-invasive imaging and data classification methods that provide in vivo data for model validation. The study concludes with a brief discussion of future work and open challenges.
Journal Article
Systematic Modeling of Municipal Wastewater Activated Sludge Process and Treatment Plant Capacity Analysis Using GPS-X
by Mu’azu, Nuhu Dalhat; Alagha, Omar; Anil, Ismail
in Biomass; Calibration; Chemical oxygen demand
2020
Mathematical modeling has become an indispensable tool for sustainable wastewater management, especially for the simulation of the complex biochemical processes involved in the activated sludge process (ASP), which requires a substantial amount of data on wastewater and sludge characteristics as well as process kinetics and stoichiometry. In this study, a systematic approach to calibrating the Activated Sludge Model No. 1 (ASM1) for a real municipal wastewater ASP was undertaken in GPS-X. The developed model was successfully validated while meeting the assumption of constant stoichiometric and kinetic coefficients for any plant influent composition. The influences of vital ASP parameters on treatment plant performance, and a capacity analysis for meeting local discharge limits, were also investigated. Lower influent chemical oxygen demand (COD, in mgO2/L) could inhibit effective nitrification and denitrification, while beyond 250 mgO2/L there is a tendency for effluent quality to breach the regulatory limit. The plant performance can remain satisfactory even for higher influent volumes of up to 60,000 m3/d and higher organic loading when the Volatile Suspended Solids/Total Suspended Solids (VSS/TSS) and particulate COD (XCOD)/VSS ratios are maintained above 0.7 and 1, respectively. The wasted activated sludge (WAS) has more impact on effluent quality than the recycle activated sludge (RAS), with significant performance improvement when the WAS was increased from 3000 to 9000 m3/d. A hydraulic retention time (HRT) > 6 h and a solids retention time (SRT) < 7 days resulted in better plant performance, with the SRT having a greater impact than the HRT. The plant performance could be sustained over an appreciable range of the COD/5-day Biochemical Oxygen Demand (BOD5, in mgO2/L) ratio, for Mixed Liquor Suspended Solids (MLSS) of up to 6000 mg/L, and when the BOD5/total nitrogen (TN) and COD/TN ratios are comparatively high. This work demonstrated a systematic approach for estimating wastewater treatment plant (WWTP) ASP parameters and the high modeling capability of ASM1 in GPS-X when respirometry test data are lacking.
Journal Article
The Ground to Space CALibration Experiment (G-SCALE): simultaneous validation of UAV, airborne, and satellite imagers for earth observation using specular targets
by Russell, Brandon J; Durell, Chris; Arroyo-Mora, Juan Pablo
in airborne hyperspectral imaging; Aircraft; Calibration
2023
The objective of the Ground to Space CALibration Experiment (G-SCALE) is to demonstrate the use of convex mirrors as a radiometric and spatial calibration and validation technology for Earth Observation assets, operating at multiple altitudes and spatial scales. Specifically, point sources with NIST-traceable absolute radiance signal are evaluated for simultaneous vicarious calibration of multi- and hyperspectral sensors in the VNIR/SWIR range, aboard Unmanned Aerial Vehicles (UAVs), manned aircraft, and satellite platforms. We introduce the experimental process, field site, instrumentation, and preliminary results of the G-SCALE, providing context for forthcoming papers that will detail the results of intercomparison between sensor technologies and remote sensing applications utilizing the mirror-based calibration approach, which is scalable across a wide range of pixel sizes with appropriate facilities. The experiment was carried out at the Rochester Institute of Technology’s Tait Preserve in Penfield, NY, USA on 23 July 2021. The G-SCALE represents a unique, international collaboration between commercial, academic, and government entities for the purpose of evaluating a novel method to improve vicarious calibration and validation for Earth Observation.
Journal Article