MBRLSearchResults
19,061 results for "Gaussian process"
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling
Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
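The nonlinear autoregressive construction described above can be sketched with ordinary GP regression: fit one GP to plentiful low-fidelity data, then fit a second GP to the scarce high-fidelity data with the low-fidelity prediction appended as an extra input, so the second GP can learn a nonlinear, space-dependent cross-correlation. The toy fidelity pair, kernels, and data sizes below are illustrative assumptions (using scikit-learn), not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy fidelity pair (assumed for illustration): the two models are related
# only through a nonlinear, space-dependent transformation.
def f_low(x):
    return np.sin(8 * np.pi * x)

def f_high(x):
    return (x - np.sqrt(2)) * f_low(x) ** 2

x_lo = np.linspace(0, 1, 50)[:, None]   # plentiful cheap observations
x_hi = np.linspace(0, 1, 14)[:, None]   # scarce expensive observations

# Level 1: GP on the low-fidelity data alone.
gp_lo = GaussianProcessRegressor(kernel=RBF(0.1), alpha=1e-8,
                                 normalize_y=True).fit(x_lo, f_low(x_lo).ravel())

# Level 2: GP on the high-fidelity data, with the level-1 prediction
# appended as an extra input dimension (the nonlinear autoregressive step).
z_hi = np.hstack([x_hi, gp_lo.predict(x_hi)[:, None]])
gp_hi = GaussianProcessRegressor(kernel=RBF([0.1, 1.0]), alpha=1e-8,
                                 normalize_y=True, n_restarts_optimizer=10,
                                 random_state=0).fit(z_hi, f_high(x_hi).ravel())

# Prediction: propagate test inputs through both levels.
x_test = np.linspace(0, 1, 200)[:, None]
z_test = np.hstack([x_test, gp_lo.predict(x_test)[:, None]])
y_pred = gp_hi.predict(z_test)
```

Because the high-fidelity GP sees the low-fidelity output as a coordinate, it can recover the oscillatory high-fidelity function from only 14 expensive samples, where a single-fidelity GP on the same budget would underfit.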
Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks
We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto–Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM–LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
Forecasting wholesale prices of yellow corn through the Gaussian process regression
For market players and policy officials, commodity price forecasts are crucial yet challenging problems, owing to the complexity of price time series. Given their strategic importance, corn crops are no exception. The current paper evaluates the forecasting problem for China’s weekly wholesale price index for yellow corn from January 1, 2010 to January 10, 2020. We develop a Gaussian process regression model, using cross-validation and Bayesian optimization over various kernels and basis functions, that can effectively handle this sophisticated commodity price forecasting problem. The model provides precise out-of-sample forecasts from January 4, 2019 to January 10, 2020, with a relative root mean square error of 1.245%, a root mean square error of 1.605, and a mean absolute error of 0.936. The models developed here might be used by market players for market evaluations and decision-making, as well as by policymakers for policy creation and execution.
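The workflow in this abstract — select a kernel by validation, fit a GP regressor, and score an out-of-sample forecast by relative RMSE — can be sketched as follows. The weekly price series is synthetic (the original data is not reproduced here), plain time-series cross-validation stands in for the paper's Bayesian optimization over kernels and basis functions, and "relative RMSE" is taken as RMSE divided by the mean of the test series, one common definition.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, WhiteKernel, ConstantKernel
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for a weekly price index: trend + annual cycle + noise.
t = np.arange(300, dtype=float)[:, None]
y = 100 + 0.05 * t.ravel() + 5 * np.sin(2 * np.pi * t.ravel() / 52) \
    + rng.normal(0, 0.5, 300)

# Candidate kernels, compared by time-series cross-validation (a simplified
# stand-in for the paper's Bayesian optimization over kernels).
candidates = {
    "RBF": ConstantKernel() * RBF(50.0) + WhiteKernel(),
    "Matern32": ConstantKernel() * Matern(50.0, nu=1.5) + WhiteKernel(),
}
scores = {}
for name, kernel in candidates.items():
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    cv = TimeSeriesSplit(n_splits=4)
    scores[name] = cross_val_score(gpr, t, y, cv=cv,
                                   scoring="neg_root_mean_squared_error").mean()

# Refit the best kernel on the training window, forecast the held-out tail.
best = max(scores, key=scores.get)
gpr = GaussianProcessRegressor(kernel=candidates[best],
                               normalize_y=True).fit(t[:280], y[:280])
y_hat, y_std = gpr.predict(t[280:], return_std=True)  # out-of-sample forecast
rrmse = np.sqrt(np.mean((y_hat - y[280:]) ** 2)) / np.mean(y[280:]) * 100
```

The predictive standard deviation `y_std` comes for free with the GP and quantifies how quickly forecast confidence decays beyond the training window.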
Predictions of steel price indices through machine learning for the regional northeast Chinese market
Commodity price projections have long been relied upon by investors and governments. This study investigates the challenging problem of forecasting the daily regional steel price index in the northeast Chinese market from January 1, 2010 to April 15, 2021. The projection of this significant commodity price indicator has not received enough attention in the literature. The forecasting model used is Gaussian process regression, trained with a combination of cross-validation and Bayesian optimization. The fitted models accurately predicted the price indices between January 8, 2019 and April 15, 2021, with an out-of-sample relative root mean square error of 0.5432%. Investors and government officials can use the established models to study pricing and support decision-making. The forecasting results can also help in constructing comparable commodity price indices, using the price trends suggested by these models as reference data.
Multivariate Gaussian and Student-t process regression for multi-output prediction
The Gaussian process model for vector-valued functions has been shown to be useful for multi-output prediction. The existing approach reformulates the matrix-variate Gaussian distribution as a multivariate normal distribution. Although effective in many cases, this reformulation is not always workable and is difficult to extend to other distributions, because not all matrix-variate distributions can be transformed into corresponding multivariate distributions; the matrix-variate Student-t distribution is one such case. In this paper, we propose a unified framework that is used not only to introduce a novel multivariate Student-t process regression model (MV-TPR) for multi-output prediction, but also to reformulate multivariate Gaussian process regression (MV-GPR) in a way that overcomes some limitations of the existing methods. Both MV-GPR and MV-TPR have closed-form expressions for the marginal likelihoods and predictive distributions under this unified framework, and can thus adopt the same optimization approaches as conventional GPR. The usefulness of the proposed methods is illustrated through several simulated and real-data examples. In particular, we verify empirically that MV-TPR is superior on the datasets considered, including air quality prediction and bike-rental prediction. Finally, the proposed methods are shown to produce profitable investment strategies in the stock markets.
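The closed-form predictive distributions mentioned here build on the standard GPR formula, mean = K(X*, X)[K(X, X) + σ²I]⁻¹Y, which already handles a multi-output Y column-wise when the outputs share a kernel. The numpy sketch below shows only that simplified column-wise case; MV-GPR and MV-TPR go further by modelling correlations between outputs through a matrix-variate construction, which is not reproduced here.

```python
import numpy as np

def gp_predict(X, Y, Xs, ls=1.0, noise=1e-6):
    """Exact GP predictive mean for each output column of Y (shared RBF kernel).

    mean = K(Xs, X) [K(X, X) + noise * I]^{-1} Y
    Column-wise independence is a simplification: MV-GPR additionally models
    correlations *between* outputs via a matrix-variate distribution.
    """
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls ** 2)

    K = k(X, X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, Y)   # one factorization shared by all outputs
    return k(Xs, X) @ alpha         # shape (n_test, n_outputs)

# Two correlated outputs observed at the same 20 inputs.
X = np.linspace(0, 1, 20)[:, None]
Y = np.column_stack([np.sin(2 * np.pi * X.ravel()),
                     np.cos(2 * np.pi * X.ravel())])
Xs = np.array([[0.25], [0.5]])
mu = gp_predict(X, Y, Xs, ls=0.2)
```

With noise-free data the predictive mean interpolates both outputs, so `mu` lands close to (sin, cos) evaluated at the test inputs.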
Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm
Many engineering problems require the optimization of expensive, black-box functions involving multiple conflicting criteria, such that commonly used methods like multiobjective genetic algorithms are inadequate. To tackle this problem, several algorithms have been developed using surrogates. However, these often have disadvantages, such as requiring a priori knowledge of the output functions or computational cost that scales exponentially with the number of objectives. In this paper a new algorithm, TSEMO, is proposed, which uses Gaussian processes as surrogates. The Gaussian processes are sampled using spectral sampling techniques, enabling Thompson sampling in conjunction with the hypervolume quality indicator and NSGA-II to choose a new evaluation point at each iteration. The reference point required for the hypervolume calculation is estimated within TSEMO. Furthermore, a simple extension is proposed to carry out batch-sequential design. TSEMO was compared to ParEGO, an expected hypervolume implementation, and NSGA-II on nine test problems with a budget of 150 function evaluations. Overall, TSEMO shows promising performance while remaining a simple algorithm: it requires no a priori knowledge, reduces hypervolume calculations to approach linear scaling with the number of objectives, can handle noise, and supports batch-sequential usage.
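The key ingredient of TSEMO — spectral sampling of a GP so that Thompson sampling can work with an explicit function draw — can be sketched with random Fourier features: by Bochner's theorem, an RBF-kernel GP sample is approximated by Bayesian linear regression on random cosine features, and sampling the weight posterior yields one deterministic function. The toy objective, lengthscale, and single-objective setting below are illustrative assumptions; TSEMO couples such draws with NSGA-II and the hypervolume indicator, which are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_gp_function(X, y, n_features=200, ls=0.15, noise=1e-4):
    """Draw one approximate GP posterior sample as an explicit function,
    via random Fourier features for the RBF kernel."""
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / ls, (n_features, d))   # spectral frequencies
    b = rng.uniform(0.0, 2 * np.pi, n_features)

    def phi(Z):
        return np.sqrt(2.0 / n_features) * np.cos(Z @ W.T + b)

    # Bayesian linear regression on the features: Gaussian weight posterior.
    P = phi(X)
    A = P.T @ P + noise * np.eye(n_features)
    mean = np.linalg.solve(A, P.T @ y)
    cov = noise * np.linalg.inv(A)
    w = rng.multivariate_normal(mean, cov)           # one weight sample
    return lambda Z: phi(Z) @ w                      # a deterministic function

# Thompson sampling step on a 1-D toy objective (illustration only).
f = lambda x: np.sin(3 * x) + 0.5 * x
X = rng.uniform(0, 3, (15, 1))
y = f(X).ravel()
g = sample_gp_function(X, y)                         # one posterior draw
grid = np.linspace(0, 3, 400)[:, None]
x_next = grid[np.argmin(g(grid)), 0]                 # next evaluation point
```

Because the draw is an ordinary function, it can be handed directly to an inner optimizer (NSGA-II in TSEMO) instead of repeatedly querying a correlated GP posterior.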
Meta-Kriging: Scalable Bayesian Modeling and Inference for Massive Spatial Datasets
Spatial process models for analyzing geostatistical data entail computations that become prohibitive as the number of spatial locations becomes large. There is a burgeoning literature on approaches for analyzing large spatial datasets. In this article, we propose a divide-and-conquer strategy within the Bayesian paradigm. We partition the data into subsets, analyze each subset using a Bayesian spatial process model, and then obtain approximate posterior inference for the entire dataset by combining the individual posterior distributions from each subset. Importantly, as often desired in spatial analysis, we offer full posterior predictive inference at arbitrary locations for the outcome as well as for the residual spatial surface after accounting for spatially oriented predictors. We call this approach "spatial meta-kriging" (SMK). We do not need to store the entire dataset on one processor, which leads to superior scalability. We demonstrate SMK with various spatial regression models, including Gaussian processes with Matérn and compactly supported correlation functions. The approach is intuitive, easy to implement, and supported by theoretical results presented in the supplementary material available online. Empirical illustrations are provided using different simulation experiments and a geostatistical analysis of Pacific Ocean sea surface temperature data. Supplementary materials for this article are available online.
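The divide-and-conquer idea can be sketched in a few lines: fit an independent GP per data subset, then pool the subset predictive distributions at new locations. The synthetic spatial field is made up, and the precision-weighted average used here is a simplified stand-in for the paper's meta-posterior combination, which comes with supporting theory.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(2)

# Synthetic spatial field (stand-in for a large geostatistical dataset).
S = rng.uniform(0, 1, (600, 2))                       # spatial locations
y = np.sin(4 * S[:, 0]) * np.cos(4 * S[:, 1]) + rng.normal(0, 0.05, 600)

# Divide: fit an independent GP on each data subset.
subsets = np.array_split(rng.permutation(600), 3)
kernel = Matern(0.3, nu=1.5) + WhiteKernel(0.01)
fits = [GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(S[idx], y[idx])
        for idx in subsets]

# Conquer: combine subset predictive distributions at new locations.
# Precision weighting is an illustration only; SMK combines subset
# posteriors more carefully.
S_new = rng.uniform(0, 1, (50, 2))
mus, sds = zip(*(m.predict(S_new, return_std=True) for m in fits))
w = np.stack([1.0 / s ** 2 for s in sds])             # per-location precisions
mu = (w * np.stack(mus)).sum(0) / w.sum(0)            # pooled predictive mean
```

No single worker ever holds all 600 observations, which is the source of the scalability: each subset fit costs a cubic solve in its own size only.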
Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel
Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow that aims at creating a material data sheet standard describing regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a combination of laser power, scan speed, and laser beam size. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction-controlled from the keyhole-melting-controlled regimes. This statistical framework is shown to be robust even for cases where the experimental training data might be suboptimal in quality, provided appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally well be used for building a surrogate model, which is beneficial since simulations are becoming more efficient and it is more practical to study the response of a new material in simulation than to re-tool an AM machine for a new material powder.
Using Gaussian process regression to simulate the vibrational Raman spectra of molecular crystals
Vibrational properties of molecular crystals are constantly used as structural fingerprints, in order to identify both the chemical nature and the structural arrangement of molecules. The simulation of these properties is typically very costly, especially when dealing with response properties of materials to, e.g., electric fields, which require a good description of the perturbed electronic density. In this work, we use Gaussian process regression (GPR) to predict the static polarizability and dielectric susceptibility of molecules and molecular crystals. We combine this framework with ab initio molecular dynamics to predict their anharmonic vibrational Raman spectra. We stress the importance of data representation, symmetry, and locality by comparing the performance of different flavors of GPR. In particular, we show the advantages of using a recently developed symmetry-adapted version of GPR. As an exemplary application, we choose paracetamol as an isolated molecule and in different crystal forms. We obtain accurate vibrational Raman spectra in all cases with fewer than 1000 training points, and obtain improvements when using a GPR trained on the molecular monomer as a baseline for the crystal GPR models. Finally, we show that our methodology is transferable across polymorphic forms: we can train the model on data for one crystal structure and still accurately predict the spectrum for a second polymorph. This procedure provides an independent route to access electronic structure properties when performing force evaluations on empirical force fields or machine-learned potential energy surfaces.
Cell to whole organ global sensitivity analysis on a four-chamber heart electromechanics model using Gaussian processes emulators
Cardiac pump function arises from a series of highly orchestrated events across multiple scales. Computational electromechanics can encode these events in physics-constrained models. However, the large number of parameters in these models has made the systematic study of the link between cellular, tissue, and organ scale parameters and whole heart physiology challenging. A patient-specific anatomical heart model, or digital twin, was created. Cellular ionic dynamics and contraction were simulated with the Courtemanche-Land and the ToR-ORd-Land models for the atria and the ventricles, respectively. Whole heart contraction was coupled with the circulatory system, simulated with CircAdapt, while accounting for the effect of the pericardium on cardiac motion. The four-chamber electromechanics framework resulted in 117 parameters of interest. The model was broken into five hierarchical sub-models: tissue electrophysiology, the ToR-ORd-Land model, the Courtemanche-Land model, passive mechanics, and CircAdapt. For each sub-model, we trained Gaussian process emulators (GPEs) that were then used to perform a global sensitivity analysis (GSA), retaining the parameters explaining 90% of the total sensitivity for subsequent analysis. We identified 45 out of 117 parameters that were important for whole heart function. We performed a GSA over these 45 parameters and identified the systemic and pulmonary peripheral resistances as critical parameters for a wide range of volumetric and hemodynamic cardiac indices across all four chambers. We have shown that GPEs provide a robust method for mapping between cellular properties and clinical measurements. This could be applied to identify parameters that can be calibrated in patient-specific models or digital twins, and to link cellular function to clinical indices.
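Gaussian process emulators make this kind of variance-based global sensitivity analysis affordable: the expensive simulator is replaced by a GP trained on a modest design, and Sobol-style indices are then estimated by Monte Carlo on the emulator alone. The sketch below substitutes a made-up three-parameter toy model for the cardiac model and uses a simple pick-freeze estimator of first-order indices; the paper's GPE/GSA pipeline, parameter counts, and cardiac sub-models are not reproduced.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)

# Hypothetical expensive simulator with 3 parameters (x1 dominates).
def model(X):
    return np.sin(X[:, 0]) + 4.0 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2]

# 1) Train a cheap GP emulator on a small space-filling design.
X_train = rng.uniform(-np.pi, np.pi, (120, 3))
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0] * 3),
                              normalize_y=True, n_restarts_optimizer=2,
                              random_state=0).fit(X_train, model(X_train))

# 2) First-order Sobol indices on the emulator (pick-freeze estimator):
#    S_i = Var(E[Y | x_i]) / Var(Y), from paired samples that share
#    coordinate i but redraw all other coordinates.
N = 4000
A = rng.uniform(-np.pi, np.pi, (N, 3))
B = rng.uniform(-np.pi, np.pi, (N, 3))
yA = gp.predict(A)
yB = gp.predict(B)
var = yA.var()
S = []
for i in range(3):
    AB = B.copy()
    AB[:, i] = A[:, i]                       # freeze coordinate i only
    S.append(np.mean(yA * (gp.predict(AB) - yB)) / var)
```

All 16,000 Monte Carlo evaluations hit the emulator rather than the simulator, which is what makes GSA over dozens of parameters tractable; here the second parameter should emerge with the largest first-order index.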