Search Results

436 results for "physics-informed machine learning"
Improving River Routing Using a Differentiable Muskingum‐Cunge Model and Physics‐Informed Machine Learning
Recently, rainfall‐runoff simulations in small headwater basins have been improved by methodological advances such as deep neural networks (NNs) and hybrid physics‐NN models—particularly, a genre called differentiable modeling that intermingles NNs with physics to learn relationships between variables. However, hydrologic routing simulations, necessary for simulating floods in stem rivers downstream of large heterogeneous basins, had not yet benefited from these advances, and it was unclear whether the routing process could be improved via coupled NNs. We present a novel differentiable routing method (δMC‐Juniata‐hydroDL2) that mimics the classical Muskingum‐Cunge routing model over a river network but embeds an NN to infer parameterizations for Manning's roughness (n) and channel geometries from raw reach‐scale attributes such as catchment area and sinuosity. The NN was trained solely on downstream hydrographs. Synthetic experiments show that while the channel geometry parameter was unidentifiable, n can be identified with moderate precision. With real‐world data, the trained differentiable routing model produced more accurate long‐term routing results for both the training gage and untrained inner gages for larger subbasins (>2,000 km²) than either a machine learning model that assumes homogeneity or a simple sum of runoff from subbasins. The n parameterization trained on short periods gave high performance in other periods, despite significant errors in runoff inputs. The learned n pattern was consistent with literature expectations, demonstrating the framework's potential for knowledge discovery, but the absolute values can vary depending on training periods. The trained n parameterization can be coupled with traditional models to improve national‐scale hydrologic flood simulations.
Key Points:
  • A novel differentiable routing model can learn effective river routing parameterizations, recovering channel roughness in synthetic runs.
  • With short periods of real training data, we can improve streamflow in large rivers compared to models that do not consider routing.
  • For basins >2,000 km², our framework outperformed deep learning models that assume homogeneity, despite bias in the runoff forcings.
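As a rough illustration of the differentiable-routing idea described above, the sketch below wires a small neural network that infers Manning's n into a classical Muskingum routing step, so a loss on the downstream hydrograph can be backpropagated into the network. It assumes PyTorch, a two-attribute reach descriptor, a placeholder mapping from n to the Muskingum travel-time parameter K, and synthetic hydrographs; it is not the paper's δMC implementation.

```python
# Minimal sketch (not the paper's dMC implementation): an NN infers Manning's n
# from reach attributes; a simplified Muskingum step routes inflow downstream;
# autograd carries the hydrograph loss back into the NN parameters.
import torch
import torch.nn as nn

class RoughnessNet(nn.Module):
    """Maps raw reach attributes (e.g., catchment area, sinuosity) to Manning's n."""
    def __init__(self, n_attrs=2):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(n_attrs, 16), nn.ReLU(), nn.Linear(16, 1))
    def forward(self, attrs):
        # Keep n in a physically plausible range (assumed bounds 0.01-0.15)
        return 0.01 + 0.14 * torch.sigmoid(self.mlp(attrs))

def muskingum_route(inflow, K, X, dt=1.0):
    """Classical Muskingum routing step with standard coefficient formulas."""
    denom = 2 * K * (1 - X) + dt
    c1 = (dt - 2 * K * X) / denom
    c2 = (dt + 2 * K * X) / denom
    c3 = (2 * K * (1 - X) - dt) / denom
    out = [inflow[0]]
    for t in range(1, len(inflow)):
        out.append(c1 * inflow[t] + c2 * inflow[t - 1] + c3 * out[-1])
    return torch.stack(out)

net = RoughnessNet()
attrs = torch.tensor([[1200.0, 1.3]])   # hypothetical area (km^2), sinuosity
inflow = torch.rand(48) * 100           # hypothetical upstream hydrograph
obs = torch.rand(48) * 100              # hypothetical observed downstream flow

n = net(attrs).squeeze()
K = 10.0 * n                            # placeholder link from n to travel time
outflow = muskingum_route(inflow, K, X=torch.tensor(0.2))
loss = torch.mean((outflow - obs) ** 2)
loss.backward()                         # gradients flow back into RoughnessNet
```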
Streamflow Prediction at the Intersection of Physics and Machine Learning: A Case Study of Two Mediterranean‐Climate Watersheds
Accurate streamflow predictions are essential for water resources management. Recent studies have examined the use of hybrid models that integrate machine learning models with process‐based (PB) hydrologic models to improve streamflow predictions. Yet, there are many open questions regarding optimal hybrid model construction, especially in Mediterranean‐climate watersheds that experience pronounced wet and dry seasons. In this study, we performed model benchmarking to (a) compare hybrid model performance to PB and machine learning models and (b) examine the sensitivity of hybrid model performance to PB model parameter calibration, structural complexity, and variable selection. Hybrid models were generated by post‐processing process‐based models using Long Short‐Term Memory neural networks. Models were benchmarked within two northern California watersheds that are managed for both municipal water supplies and aquatic habitat. Though model performance varied substantially by watershed and error metric, calibrated hybrid models frequently outperformed both the machine learning model (for 72% of watershed‐model‐metric combinations) and the calibrated process‐based models (for 79% of combinations). Furthermore, hybrid models were relatively insensitive to PB model calibration and structural complexity, but sensitive to PB model variable selection. Our results demonstrate that hybrid models can improve streamflow prediction in Mediterranean‐climate watersheds. Additionally, hybrid model insensitivity to PB model parameter calibration and structural complexity suggests that uncalibrated or less complex PB models could be used in hybrid models without any loss of streamflow prediction accuracy, improving model construction efficiency. Moreover, hybrid model sensitivity to the selection of PB model variables suggests a strategy for diagnosing poorly performing PB model components.
Key Points:
  • Hybrid streamflow prediction models frequently outperformed both machine learning and process‐based (PB) models.
  • Hybrid models were relatively insensitive to PB model calibration and structural complexity, but sensitive to PB model variable selection.
  • Hybrid models can improve streamflow prediction accuracy, efficiency, and diagnostics in Mediterranean‐climate watersheds.
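A minimal sketch of the hybrid construction described above, assuming PyTorch and purely illustrative inputs: an LSTM post-processes a process-based (PB) model by taking daily forcings plus PB-simulated streamflow as features and predicting observed streamflow. The feature set, network size, and training loop are placeholders, not the study's configuration.

```python
# Minimal sketch of LSTM post-processing of a process-based (PB) model:
# inputs are daily forcings plus the PB model's simulated streamflow,
# the target is observed streamflow. Feature choices are illustrative.
import torch
import torch.nn as nn

class HybridLSTM(nn.Module):
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):                # x: (batch, time, features)
        h, _ = self.lstm(x)
        return self.head(h).squeeze(-1)  # predicted streamflow per time step

# Hypothetical tensors: precipitation, temperature, PB-simulated flow -> observed flow
batch, T = 8, 365
x = torch.randn(batch, T, 3)
y_obs = torch.randn(batch, T)

model = HybridLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                       # tiny training loop for illustration
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y_obs)
    loss.backward()
    opt.step()
```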
Physics‐Informed Deep‐Learning For Elasticity: Forward, Inverse, and Mixed Problems
Elastography is a medical imaging technique used to measure the elasticity of tissues by comparing ultrasound signals before and after a light compression. The lateral resolution of ultrasound is much lower than the axial resolution. Current elastography methods generally require both axial and lateral displacement components, making them less effective for clinical applications. Additionally, these methods often rely on the assumption of material incompressibility, which can lead to inaccurate elasticity reconstruction, as no materials are truly incompressible. To address these challenges, a new physics‐informed deep‐learning method for elastography is proposed. This new method integrates a displacement network and an elasticity network to reconstruct the Young's modulus field of a heterogeneous object based on only a measured axial displacement field. It also allows the assumption of material incompressibility to be removed, enabling the simultaneous reconstruction of both Young's modulus and Poisson's ratio fields. The authors demonstrate that using multiple measurements can mitigate the potential error introduced by the "eggshell" effect, in which the presence of stiff material prevents the generation of strain in soft material. These improvements make this new method a valuable tool for a wide range of applications in medical imaging, materials characterization, and beyond.
ElastNet learns the Young's modulus field of a heterogeneous object based on a measured displacement field. The predicted stress tensor is calculated by the encoded elastic constitutive relation based on the strain and Young's modulus. The training procedure minimizes the unbalanced forces with a physical constraint and updates the predicted Young's modulus using backpropagation.
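The following is a compact sketch of the two-network idea under an assumed plane-stress formulation (not the paper's ElastNet code): a displacement network fits the measured axial displacement, an elasticity network predicts Young's modulus, and the loss combines the data misfit with the residual of static equilibrium (div σ = 0) evaluated by automatic differentiation. Poisson's ratio is held fixed here for brevity, although the paper reconstructs it as well.

```python
# Minimal plane-stress sketch (assumed formulation, not the paper's ElastNet):
# a displacement net u(x,y) and an elasticity net E(x,y) are trained jointly so
# that u matches the measured axial displacement and the stress field it implies
# satisfies static equilibrium, div(sigma) = 0, evaluated with autograd.
import torch
import torch.nn as nn

def mlp(out_dim):
    return nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                         nn.Linear(64, out_dim))

disp_net = mlp(2)   # outputs (u, v): axial and lateral displacement
elas_net = mlp(1)   # outputs Young's modulus E (Poisson's ratio fixed in this sketch)
nu = 0.45           # assumed constant Poisson's ratio

def grad(f, x):
    return torch.autograd.grad(f, x, torch.ones_like(f), create_graph=True)[0]

def total_loss(xy, u_meas):
    xy = xy.requires_grad_(True)
    u, v = disp_net(xy).unbind(-1)
    E = torch.nn.functional.softplus(elas_net(xy)).squeeze(-1)

    du, dv = grad(u, xy), grad(v, xy)               # displacement gradients
    exx, eyy, gxy = du[:, 0], dv[:, 1], du[:, 1] + dv[:, 0]
    sxx = E / (1 - nu**2) * (exx + nu * eyy)        # plane-stress Hooke's law
    syy = E / (1 - nu**2) * (eyy + nu * exx)
    sxy = E / (2 * (1 + nu)) * gxy

    rx = grad(sxx, xy)[:, 0] + grad(sxy, xy)[:, 1]  # equilibrium residuals
    ry = grad(sxy, xy)[:, 0] + grad(syy, xy)[:, 1]
    data = torch.mean((u - u_meas) ** 2)            # only axial displacement measured
    physics = torch.mean(rx ** 2 + ry ** 2)
    return data + physics

xy = torch.rand(256, 2)      # hypothetical measurement coordinates
u_meas = torch.rand(256)     # hypothetical measured axial displacement
total_loss(xy, u_meas).backward()
```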
Learning Constitutive Relations From Soil Moisture Data via Physically Constrained Neural Networks
The constitutive relations of the Richardson‐Richards equation encode the macroscopic properties of soil water retention and conductivity. These soil hydraulic functions are commonly represented by models with a handful of parameters. The limited degrees of freedom of such soil hydraulic models constrain our ability to extract soil hydraulic properties from soil moisture data via inverse modeling. We present a new free‐form approach to learning the constitutive relations using physically constrained neural networks. We implemented the inverse modeling framework in JAX, a differentiable programming framework, to ensure scalability and extensibility. For efficient gradient computations, we implemented implicit differentiation through a nonlinear solver for the Richardson‐Richards equation. We tested the framework against synthetic noisy data and demonstrated its robustness against varying magnitudes of noise and degrees of freedom of the neural networks. We applied the framework to soil moisture data from an upward infiltration experiment and demonstrated that the neural network‐based approach fitted the experimental data better than a parametric model and that the framework can learn the constitutive relations.
Key Points:
  • We developed a fully differentiable solver for the Richardson‐Richards equation.
  • The constitutive relations are represented by physically constrained neural networks.
  • The framework can be used to extract soil hydraulic properties without assuming coupling between the constitutive relations.
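To illustrate the implicit-differentiation trick highlighted above, the toy sketch below (JAX, with a scalar residual standing in for the discretized Richardson-Richards system) solves F(u, p) = 0 by Newton's method and then obtains du/dp from the implicit function theorem, du/dp = -(dF/du)^-1 dF/dp, so gradients pass through the solver without differentiating its iterations.

```python
# Toy sketch of implicit differentiation through a nonlinear solve in JAX
# (illustrative only; not the Richardson-Richards solver from the paper).
import jax
import jax.numpy as jnp

def residual(u, p):
    # Hypothetical nonlinear residual F(u, p) = 0 with parameter vector p
    return u**3 + p[0] * u + p[1]

def newton_solve(p, u0=0.0, iters=30):
    u = u0
    for _ in range(iters):
        f = residual(u, p)
        df = jax.grad(residual, argnums=0)(u, p)
        u = u - f / df
    return u

def du_dp(p):
    """Implicit function theorem: du/dp = -(dF/du)^-1 dF/dp at the solution."""
    u_star = newton_solve(p)
    dF_du = jax.grad(residual, argnums=0)(u_star, p)
    dF_dp = jax.grad(residual, argnums=1)(u_star, p)
    return -dF_dp / dF_du

p = jnp.array([2.0, -1.0])
print(newton_solve(p), du_dp(p))
```

In the paper's setting the solution is a vector of pressure heads and the linear solve uses the full Jacobian; the scalar case above only shows the structure of the gradient computation.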
Continuous Physics‐Informed Learning Expedited Battery Mechanism Decoupling
Accurate prediction of battery behavior under different dynamic operating conditions is critical for both fundamental research and practical applications. However, the diversity of emerging materials and cell architectures presents significant challenges to the generalizability of conventional prognostic approaches. Here, a novel physics‐informed battery modeling network (PIBMN) is proposed that integrates data‐driven learning with physical priors, enabling continuous parameter adaptation and broad applicability across cell formats and chemistries. PIBMN effectively captures both fast and slow dynamic responses under a wide range of load profiles and is applicable to both commercial and laboratory‐scale cells. By maintaining nonlinear expressivity while ensuring numerical stability, the model yields high‐fidelity, interpretable representations of internal electrochemical states. Beyond conventional health prognostics, PIBMN introduces a novel capability to decouple complex kinetic processes and concurrently track terminal voltage in real time, enabling mechanistic diagnostics with high resolution. As such, PIBMN establishes a versatile and scalable framework for in‐line quality control, adaptive cell‐specific battery management, and data‐informed optimization of next‐generation battery manufacturing processes.
A battery modeling framework combining dual neural networks with physical priors is presented to track internal mechanisms, decouple overpotential components, and predict aging trajectories early using only DC data, eliminating the need for additional measurements. The proposed framework, based on PINN and PIKAN, is well aligned with the current drive toward explainable, data‐efficient AI models in energy systems.
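The abstract does not disclose PIBMN's equations, so the following is only a generic sketch of combining a physical prior with a neural correction for terminal voltage, V ≈ OCV(SOC) - I·R0 - η_NN(SOC, I): the open-circuit-voltage curve and ohmic drop act as the prior, and a small network learns the residual overpotential. The OCV curve, resistance value, and data are assumptions.

```python
# Generic sketch of a physics-prior + neural-correction battery voltage model
# (illustrative; the abstract does not disclose PIBMN's actual equations).
import torch
import torch.nn as nn

def ocv(soc):
    # Hypothetical open-circuit-voltage curve as a function of state of charge
    return 3.0 + 1.2 * soc - 0.1 * torch.cos(6.28 * soc)

class Overpotential(nn.Module):
    """Small NN that learns the residual (kinetic + diffusion) overpotential."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
    def forward(self, soc, current):
        return self.net(torch.stack([soc, current], dim=-1)).squeeze(-1)

R0 = 0.02                               # assumed ohmic resistance (ohm)
eta = Overpotential()

def terminal_voltage(soc, current):
    return ocv(soc) - current * R0 - eta(soc, current)

# Fit the correction term to (hypothetical) measured voltage
soc = torch.rand(512); current = torch.randn(512)
v_meas = ocv(soc) - current * R0 + 0.01 * torch.randn(512)
opt = torch.optim.Adam(eta.parameters(), lr=1e-3)
for _ in range(10):
    opt.zero_grad()
    loss = nn.functional.mse_loss(terminal_voltage(soc, current), v_meas)
    loss.backward(); opt.step()
```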
Prediction of Oil Recovery Factor in Stratified Reservoirs after Immiscible Water-Alternating Gas Injection Based on PSO-, GSA-, GWO-, and GA-LSSVM
In this study, we address the challenge of predicting the oil recovery factor (RF) in layered heterogeneous reservoirs after 1.5 pore volumes of water, gas, or water-alternating-gas (WAG) injection. A dataset of ~2500 reservoir simulations is analyzed, based on a Black Oil 2D Model with different combinations of reservoir heterogeneity, WAG hysteresis, gravity influence, mobility ratios, and WAG ratios. In the first model, MOD1, RF is correlated with one input (an effective WAG mobility ratio M*). A good correlation (Pearson coefficient −0.94), but with scatter, motivated a second model, MOD2, using eight input parameters: water–oil and gas–oil mobility ratios, water–oil and gas–oil gravity numbers, a reservoir heterogeneity factor, two hysteresis parameters, and water fraction. The two mobility ratios exhibited the strongest correlation with RF (Pearson coefficient −0.57 for gas–oil and −0.48 for water–oil). LSSVM was applied in MOD2 and trained using different optimizers: PSO, GA, GWO, and GSA. A physics-based adaptation of the dataset was proposed to properly handle single-phase injection. A total of 70% of the data was used for training, 15% for validation, and 15% for testing. GWO and PSO optimized the model equally well (R² = 0.9965 on the validation set), slightly better than GA and GSA (R² = 0.9963). On the total dataset, MOD1 achieved RMSE = 0.050 and R² = 0.889, while MOD2 achieved RMSE = 0.0080 and R² = 0.998. WAG outperformed single-phase injection, in some cases yielding an RF 0.3 units higher. The benefits of WAG increased with stronger hysteresis. The LSSVM model could be trained to be less dependent on hysteresis and on the non-injected phase during single-phase injection.
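For reference, a least-squares SVM (LSSVM) regressor reduces to solving one linear system per hyperparameter choice. The sketch below assumes NumPy, synthetic eight-feature data, a crude random search standing in for the PSO/GA/GWO/GSA optimizers, and a 70/15/15 split; it shows the basic workflow rather than the paper's actual dataset or tuning.

```python
# Minimal least-squares SVM (LSSVM) regression sketch with an RBF kernel and a
# simple random search over hyperparameters as a stand-in for PSO/GA/GWO/GSA.
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma, sigma):
    n = len(y)
    K = rbf(X, X, sigma)
    # LSSVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0; A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                   # b, alpha

def lssvm_predict(Xtr, b, alpha, sigma, Xnew):
    return rbf(Xnew, Xtr, sigma) @ alpha + b

# Hypothetical dataset with eight inputs (as in MOD2) and a 70/15/15 split
X = rng.normal(size=(500, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)
idx = rng.permutation(500)
tr, va = idx[:350], idx[350:425]             # remaining 15% held out for testing

best = None
for _ in range(50):                          # crude random search over (gamma, sigma)
    gamma, sigma = 10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-1, 1)
    b, alpha = lssvm_fit(X[tr], y[tr], gamma, sigma)
    mse = np.mean((lssvm_predict(X[tr], b, alpha, sigma, X[va]) - y[va]) ** 2)
    if best is None or mse < best[0]:
        best = (mse, gamma, sigma)
print("validation MSE, gamma, sigma:", best)
```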
Machine Learning for Shape Memory Graphene Nanoribbons and Applications in Biomedical Engineering
Shape memory materials have played an important role in a wide range of bioengineering applications. At the same time, recent developments in graphene-based nanostructures, such as nanoribbons, have demonstrated that, due to the unique properties of graphene, they can manifest superior electronic, thermal, mechanical, and optical characteristics ideally suited for the next generation of diagnostic devices, drug delivery systems, and other biomedical applications. One of the most intriguing parts of these new developments lies in the fact that certain types of such graphene nanoribbons can exhibit shape memory effects. In this paper, we apply machine learning tools to build an interatomic potential from DFT calculations for highly ordered graphene oxide nanoribbons, a material that has demonstrated shape memory effects with a recovery strain of up to 14.5% for 2D layers. The graphene oxide layer can shrink to a metastable phase with a smaller lattice constant through the application of an electric field, and returns to the initial phase through an external mechanical force. The deformation leads to an electronic rearrangement and induces magnetization around the oxygen atoms. DFT calculations show no magnetization for sufficiently narrow nanoribbons, while the machine learning model can predict the suppression of the metastable phase for the same narrower nanoribbons. We can improve the prediction accuracy by analyzing only the evolution of the metastable phase, where no magnetization is found according to DFT calculations. The model developed here also allows us to study the evolution of the phases for wider nanoribbons, which would be computationally inaccessible through a pure DFT approach. Moreover, we extend our analysis to realistic systems that include vacancies and boron or nitrogen impurities at the oxygen atomic positions. Finally, we provide a brief overview of the current and potential applications of materials exhibiting shape memory effects in bioengineering and biomedical fields, focusing on data-driven approaches with machine learning interatomic potentials.
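The abstract does not specify the ML potential's architecture, so the sketch below shows only the common per-atom neural-network-potential pattern under assumed placeholder descriptors and data: each atom's local environment is encoded as a descriptor vector, a shared network maps descriptors to atomic energies, and their sum is regressed against DFT total energies.

```python
# Schematic per-atom neural-network potential fitted to DFT total energies.
# The descriptor, network, and data here are placeholders; the abstract does not
# specify the actual ML potential used for the graphene oxide nanoribbons.
import torch
import torch.nn as nn

class AtomicEnergyNet(nn.Module):
    def __init__(self, descriptor_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(descriptor_dim, 32), nn.Tanh(),
                                 nn.Linear(32, 1))
    def forward(self, descriptors):      # (n_structures, n_atoms, descriptor_dim)
        e_atom = self.net(descriptors).squeeze(-1)
        return e_atom.sum(dim=-1)        # total energy = sum of atomic energies

model = AtomicEnergyNet()
# Hypothetical training set: 100 structures x 40 atoms, with DFT reference energies
desc = torch.randn(100, 40, 8)
e_dft = torch.randn(100)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(20):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(desc), e_dft)
    loss.backward(); opt.step()
```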
Transport physics‐informed reinforcement learning agents deployed in standalone infusion pumps for managing multidrug delivery in critical care
Managing the delivery of complex multidrug infusions in anesthesia and critical care presents a significant clinical challenge. Current practices relying on manual control of infusion pumps often result in unpredictable drug delivery profiles and dosing errors—key issues highlighted by the United States Food and Drug Administration (FDA). To address these issues, we introduce the SMART (synchronized‐pump management algorithms for reliable therapies) framework, a novel approach that leverages low Reynolds number drug transport physics and machine learning to accurately manage multidrug infusions in real time. SMART is activated based on the Shafer number (Sh), a novel non‐dimensional number that quantifies the relative magnitude of a drug's therapeutic action timescale to its transport timescale within infusion manifolds. SMART is useful when Sh < 1, where drug transport becomes the rate-limiting step in achieving the desired therapeutic effects. When activated, SMART monitors multidrug concentrations within infusion manifolds and leverages this information to perform end‐to‐end management of drug delivery using an ensemble of deterministic and deep reinforcement learning (RL) decision networks. Notably, SMART RL networks employ a differentially sampled split-buffer architecture that accelerates learning and improves performance by seamlessly combining deterministic predictions with RL experience during training. SMART deployed in standalone infusion pumps under simulated clinical conditions outperformed state‐of‐the‐art manual control protocols. This framework has the potential to revolutionize critical care by enhancing the accuracy of medication delivery and reducing cognitive workloads. Beyond critical care, the ability to accurately manage multi‐liquid delivery via complex manifolds will have important bearing on manufacturing and process control.
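Reading the Shafer number as the ratio of a drug's therapeutic action timescale to its transport timescale in the manifold, a back-of-the-envelope calculation looks like the sketch below; the transport-timescale estimate (manifold dead volume divided by flow rate) and all numbers are assumptions, and the paper's exact definition may differ.

```python
# Back-of-the-envelope reading of the Shafer number described in the abstract:
# Sh ~ (therapeutic action timescale) / (transport timescale in the manifold).
# The transport-timescale estimate below (dead volume / flow rate) is an assumption.

def shafer_number(t_action_s, dead_volume_ml, flow_rate_ml_per_h):
    t_transport_s = dead_volume_ml / flow_rate_ml_per_h * 3600.0
    return t_action_s / t_transport_s

# Hypothetical fast-acting drug in a shared manifold:
sh = shafer_number(t_action_s=60.0, dead_volume_ml=2.0, flow_rate_ml_per_h=10.0)
print(f"Sh = {sh:.2f} -> transport-limited (SMART useful)" if sh < 1
      else f"Sh = {sh:.2f} -> not transport-limited")
```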
Physics-informed machine learning: case studies for weather and climate modelling
Machine learning (ML) provides novel and powerful ways of accurately and efficiently recognizing complex patterns, emulating nonlinear dynamics, and predicting the spatio-temporal evolution of weather and climate processes. Off-the-shelf ML models, however, do not necessarily obey the fundamental governing laws of physical systems, nor do they generalize well to scenarios on which they have not been trained. We survey systematic approaches to incorporating physics and domain knowledge into ML models and distill these approaches into broad categories. Through 10 case studies, we show how these approaches have been used successfully for emulating, downscaling, and forecasting weather and climate processes. The accomplishments of these studies include greater physical consistency, reduced training time, improved data efficiency, and better generalization. Finally, we synthesize the lessons learned and identify scientific, diagnostic, computational, and resource challenges for developing truly robust and reliable physics-informed ML models for weather and climate processes. This article is part of the theme issue ‘Machine learning for weather and climate modelling’.
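One of the broad categories such surveys distill is adding soft physics constraints to the training loss. The sketch below shows that pattern with an assumed, purely illustrative conservation relation (the emulator's output components must sum to an input total); it is not drawn from any of the ten case studies.

```python
# Minimal example of one broad PIML category surveyed: a soft physics constraint
# added to an emulator's loss. The "conservation" relation here is illustrative.
import torch
import torch.nn as nn

emulator = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 3))

def loss_fn(x, y_true, lam=1.0):
    y_pred = emulator(x)
    data_loss = nn.functional.mse_loss(y_pred, y_true)
    # Assumed conservation constraint: the three output components (e.g., water
    # partitioned into runoff, evaporation, storage change) must sum to the input total.
    total_in = x[:, 0]
    physics_loss = torch.mean((y_pred.sum(dim=1) - total_in) ** 2)
    return data_loss + lam * physics_loss

x = torch.rand(64, 10); y = torch.rand(64, 3)
loss_fn(x, y).backward()
```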
A Review of Physics-Informed Machine Learning in Fluid Mechanics
Physics-informed machine learning (PIML) enables the integration of domain knowledge with machine learning (ML) algorithms, which results in higher data efficiency and more stable predictions. This provides opportunities for augmenting—and even replacing—high-fidelity numerical simulations of complex turbulent flows, which are often expensive due to the requirement of high temporal and spatial resolution. In this review, we (i) provide an introduction and historical perspective of ML methods, in particular neural networks (NNs), (ii) examine existing PIML applications to fluid mechanics problems, especially in complex high Reynolds number flows, (iii) demonstrate the utility of PIML techniques through a case study, and (iv) discuss the challenges and opportunities of developing PIML for fluid mechanics.