Catalogue Search | MBRL
60,417 result(s) for "Mathematical and Computational Engineering"
Computational trust models and machine learning
"This book provides an introduction to computational trust models from a machine learning perspective. After reviewing traditional computational trust models, it discusses a new trend of applying formerly unused machine learning methodologies, such as supervised learning. The application of various learning algorithms, such as linear regression, matrix decomposition, and decision trees, illustrates how to translate the trust modeling problem into a (supervised) learning problem. The book also shows how novel machine learning techniques can improve the accuracy of trust assessment compared to traditional approaches" -- Provided by publisher.
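The book's framing of trust assessment as a supervised learning problem can be sketched with ordinary linear regression. The feature names, weights, and data below are synthetic illustrations invented for this sketch; they are not taken from the book.

```python
import numpy as np

# Hedged sketch: regress observed trust scores onto interaction features,
# turning trust modeling into a supervised learning problem. All names and
# numbers here are hypothetical.

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0.0, 1.0, (n, 3))          # [success_rate, norm_count, recency]
w_true = np.array([0.7, 0.2, 0.1])         # hypothetical ground-truth weights
y = X @ w_true + rng.normal(0.0, 0.01, n)  # noisy observed trust scores

# Linear-regression fit by least squares recovers the weights.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The same data could instead be fed to matrix decomposition or decision trees, the other learners the book discusses; the translation step (features in, trust score out) is identical.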
An introduction to transfer entropy : information flow in complex systems
by
Lizier, Joseph T.
,
Harré, Michael
,
Barnett, Lionel
in
Artificial Intelligence
,
Complex Systems
,
Computer Science
2016
This book considers a relatively new metric in complex systems, transfer entropy, derived from a series of measurements, usually a time series.
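For discrete time series, the metric the book covers admits a simple plug-in (histogram) estimator: TE(Y -> X) = sum over (x', x, y) of p(x', x, y) * log2[ p(x'|x, y) / p(x'|x) ], where x' is the next value of the target. The sketch below is this standard estimator with history length 1, not code from the book.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy (bits) from `source` to `target`,
    both discrete series, with one step of history."""
    triples, pairs_xy, pairs_x1x, singles_x = Counter(), Counter(), Counter(), Counter()
    n = len(target) - 1
    for t in range(n):
        x, y, x1 = target[t], source[t], target[t + 1]
        triples[(x1, x, y)] += 1   # joint of next target, past target, past source
        pairs_xy[(x, y)] += 1
        pairs_x1x[(x1, x)] += 1
        singles_x[x] += 1
    te = 0.0
    for (x1, x, y), c in triples.items():
        p_next_given_both = c / pairs_xy[(x, y)]
        p_next_given_past = pairs_x1x[(x1, x)] / singles_x[x]
        te += (c / n) * math.log2(p_next_given_both / p_next_given_past)
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]   # driving series
z = [random.randint(0, 1) for _ in range(5000)]   # unrelated series
x = [0] + y[:-1]                                  # x copies y with a one-step lag

te_driver = transfer_entropy(y, x)      # near 1 bit: y fully determines x's next value
te_bystander = transfer_entropy(z, x)   # near 0 bits: z carries no extra information
```

Because x's next value is an exact copy of y's current value, knowing y removes all remaining uncertainty (about 1 bit here), while the unrelated series z contributes essentially nothing.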
Scientific Machine Learning Through Physics–Informed Neural Networks: Where we are and What’s Next
by
Piccialli, Francesco
,
Di Cola, Vincenzo Schiano
,
Giampaolo, Fabio
in
Algorithms
,
Applied mathematics
,
Approximation
2022
Physics-Informed Neural Networks (PINN) are neural networks (NNs) that encode model equations, such as Partial Differential Equations (PDE), as a component of the neural network itself. PINNs are nowadays used to solve PDEs, fractional equations, integro-differential equations, and stochastic PDEs. This novel methodology has arisen as a multi-task learning framework in which a NN must fit observed data while reducing a PDE residual. This article provides a comprehensive review of the literature on PINNs; the primary goal of the study is to characterize these networks and their related advantages and disadvantages. The review also attempts to incorporate publications on a broader range of collocation-based physics-informed neural networks, starting from the vanilla PINN and covering many other variants, such as physics-constrained neural networks (PCNN), variational hp-VPINN, and conservative PINN (CPINN). The study indicates that most research has focused on customizing the PINN through different activation functions, gradient optimization techniques, neural network structures, and loss function structures. Despite the wide range of applications in which PINNs have proved more feasible than classical numerical techniques such as the Finite Element Method (FEM), advancements are still possible, most notably on theoretical issues that remain unresolved.
Journal Article
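The multi-task loss at the heart of the PINN framework (fit observed data while reducing a PDE residual) can be sketched on a toy ODE. A real PINN uses a neural network and automatic differentiation; here a candidate function and a finite-difference derivative stand in, and every name below is an illustrative assumption.

```python
import numpy as np

# Toy problem: u'(x) = -u(x), u(0) = 1, exact solution exp(-x).
# The PINN-style loss combines a data-fit term and a physics (residual) term.

def pinn_loss(u, xs, x_data, u_data, h=1e-4, w_pde=1.0):
    # Data term: mean squared error against observed samples.
    data_loss = np.mean((u(x_data) - u_data) ** 2)
    # Physics term: residual of u' + u = 0 at collocation points,
    # with u' approximated by a central finite difference.
    du = (u(xs + h) - u(xs - h)) / (2.0 * h)
    pde_loss = np.mean((du + u(xs)) ** 2)
    return data_loss + w_pde * pde_loss

xs = np.linspace(0.0, 1.0, 50)        # collocation points for the residual
x_data = np.array([0.0, 0.5, 1.0])    # locations of observed data
u_data = np.exp(-x_data)              # noise-free observations

exact = lambda x: np.exp(-x)          # satisfies both the data and the ODE
wrong = lambda x: 1.0 - x             # matches u(0) but violates the ODE

loss_exact = pinn_loss(exact, xs, x_data, u_data)
loss_wrong = pinn_loss(wrong, xs, x_data, u_data)
```

Training a PINN amounts to minimizing this composite loss over network parameters; the residual term is what injects the model equations into the otherwise data-driven fit.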
Recent advances and applications of deep learning methods in materials science
by
Park, Cheol Woo
,
Wolverton, Chris
,
DeCost, Brian
in
Cross cutting
,
Data analysis
,
Data science
2022
Deep learning (DL) is one of the fastest-growing topics in materials data science, with rapidly emerging applications spanning atomistic, image-based, spectral, and textual data modalities. DL allows analysis of unstructured data and automated identification of features. The recent development of large materials databases has fueled the application of DL methods in atomistic prediction in particular. In contrast, advances in image and spectral data have largely leveraged synthetic data enabled by high-quality forward models as well as by generative unsupervised DL methods. In this article, we present a high-level overview of deep learning methods followed by a detailed discussion of recent developments of deep learning in atomistic simulation, materials imaging, spectral analysis, and natural language processing. For each modality we discuss applications involving both theoretical and experimental data, typical modeling approaches with their strengths and limitations, and relevant publicly available software and datasets. We conclude the review with a discussion of recent cross-cutting work related to uncertainty quantification in this field and a brief perspective on limitations, challenges, and potential growth areas for DL methods in materials science.
Journal Article
A Systematic Review of the Whale Optimization Algorithm: Theoretical Foundation, Improvements, and Hybridizations
by
Nadimi-Shahraki, Mohammad H.
,
Asghari Varzaneh, Zahra
,
Mirjalili, Seyedali
in
Algorithms
,
Citations
,
Engineering
2023
Despite the simplicity of the whale optimization algorithm (WOA) and its success in solving some optimization problems, it faces many issues. Thus, WOA has attracted scholars' attention, and researchers frequently prefer to employ and improve it to address real-world optimization problems. As a result, many WOA variants have been developed, usually following two main approaches: improvement and hybridization. However, no comprehensive study has critically reviewed and analyzed WOA and its variants to identify effective techniques and algorithms and develop more successful variants. Therefore, in this paper, the WOA is first critically analyzed, and then the last 5 years' developments of WOA are systematically reviewed. To do this, a new adapted PRISMA methodology is introduced to select eligible papers, comprising three main stages: identification, evaluation, and reporting. The evaluation stage was improved using three screening steps and strict inclusion criteria to select a reasonable number of eligible papers. Ultimately, 59 improved WOA and 57 hybrid WOA variants published by reputable publishers, including Springer, Elsevier, and IEEE, were selected as eligible papers. Effective techniques for improving, and successful algorithms for hybridizing, the eligible WOA variants are described. The eligible WOA variants are reviewed in continuous, binary, single-objective, and multi/many-objective categories. The distribution of eligible WOA variants with respect to publisher, journal, application, and authors' country is visualized. It is also concluded that most papers in this area lack a comprehensive comparison with previous WOA variants and are usually compared only with other algorithms. Finally, some future directions are suggested.
Journal Article
A Review on Kalman Filter Models
2023
The Kalman Filter (KF), also known as the linear quadratic estimation filter, recursively estimates the current state of a system over time from input measurements and a mathematical process model. The algorithm runs in two steps: in the prediction step, an estimate of the current state variables under uncertainty is produced; in the update step, once a measurement is obtained, the previous estimate is refined by a weighted arithmetic mean. Applying the KF directly to non-linear systems is difficult; for such systems, the Extended KF (EKF) and the Unscented KF (UKF) provide first-order and higher-order linear approximations, respectively. In more complicated systems, a single KF cannot predict appropriate values for modeling system behavior. The current study, in addition to covering the basic methods, reviews recent research on Multiple Model (MM) filters: more reliable estimates are obtained by running two or more filters with different models in parallel, assigning an estimate to each filter and combining their outputs. MM Adaptive Estimation (MMAE) and the Interacting MM (IMM) are the most widely used MM estimation methods.
Journal Article
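The predict/update cycle the review describes can be sketched for a linear system in a few lines. The matrices follow the standard textbook form for a 1-D constant-velocity model observed through noisy position measurements; the specific values are illustrative assumptions.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    # Predict: propagate state and uncertainty through the process model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend prediction and measurement via the Kalman gain
    # (the "weighted arithmetic mean" the review refers to).
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamics
Q = 1e-4 * np.eye(2)                   # small process noise
H = np.array([[1.0, 0.0]])             # position is observed, velocity is not
R = np.array([[0.25]])                 # measurement noise variance (std 0.5)

x, P = np.zeros(2), np.eye(2)          # initial state and covariance
rng = np.random.default_rng(0)
for t in range(50):
    true_pos = 0.5 * t                 # object moving at 0.5 units per step
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x, P = kalman_step(x, P, z, F, Q, H, R)
```

After 50 steps the filter has inferred both position and the unobserved velocity from noisy position readings alone; nonlinear dynamics would require replacing F and H with the linearizations (EKF) or sigma-point propagation (UKF) the review mentions.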
Global Convergence of ADMM in Nonconvex Nonsmooth Optimization
by
Wang, Yu
,
Yin, Wotao
,
Zeng, Jinshan
in
Algorithms
,
Computational Mathematics and Numerical Analysis
,
Continuity (mathematics)
2019
In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function ϕ(x_0, …, x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, …, x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it has a special role in our analysis. The developed convergence guarantee covers a variety of nonconvex functions such as piecewise linear functions, the ℓ_q quasi-norm, the Schatten-q quasi-norm (0 < q < 1), the minimax concave penalty (MCP), and the smoothly clipped absolute deviation penalty. It also allows nonconvex constraints such as compact manifolds (e.g., spherical, Stiefel, and Grassmann manifolds) and linear complementarity constraints. Also, the x_0-block can be almost any lower semi-continuous function. By applying our analysis, we show, for the first time, that several ADMM algorithms applied to solve nonconvex models in statistical learning, optimization on manifolds, and matrix decomposition are guaranteed to converge. Our results provide sufficient conditions for ADMM to converge on (convex or nonconvex) monotropic programs with three or more blocks, as they are special cases of our model. ADMM has been regarded as a variant of the augmented Lagrangian method (ALM). We present a simple example to illustrate how ADMM converges but ALM diverges with a bounded penalty parameter β. As indicated by this example and other analysis in this paper, ADMM might be a better choice than ALM for some nonconvex nonsmooth problems, because ADMM is not only easier to implement, it is also more likely to converge in the scenarios concerned.
Journal Article
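The primal-then-dual update pattern the abstract describes can be illustrated on the classic two-block lasso problem, min over x of 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z. This is the standard textbook scheme, not the paper's multi-block nonconvex algorithm; all data below is synthetic.

```python
import numpy as np

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))   # cached solve for the x-update
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))          # x-update: smooth quadratic block
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # z-update: soft-threshold
        u = u + x - z                          # scaled dual update
    return z

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10))
coef_true = np.zeros(10)
coef_true[0], coef_true[5] = 3.0, -2.0         # sparse ground truth
b = A @ coef_true                              # noiseless observations
coef_hat = lasso_admm(A, b, lam=0.1)
```

Each iteration minimizes the augmented Lagrangian over one primal block at a time before the dual step, which is exactly the structure whose nonconvex generalization the paper analyzes.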
Sand Cat swarm optimization: a nature-inspired algorithm to solve global optimization problems
2023
This study proposes a new metaheuristic algorithm called sand cat swarm optimization (SCSO), which mimics the behavior sand cats use to survive in nature. These cats are able to detect low frequencies below 2 kHz and have an incredible ability to dig for prey. The proposed algorithm, inspired by these two features, consists of two main phases (search and attack). It controls the transitions between the exploration and exploitation phases in a balanced manner and performs well in finding good solutions with fewer parameters and operations, achieved by determining the direction and speed of the appropriate movements with a defined adaptive strategy. The SCSO algorithm is tested on 20 well-known test functions along with 10 modern complex test functions from the CEC2019 benchmark, and the obtained results are compared with famous metaheuristic algorithms. According to the results, SCSO found the best solution in 63.3% of the test functions. Moreover, the SCSO algorithm is applied to seven challenging engineering design problems: welded beam design, tension/compression spring design, pressure vessel design, piston lever, speed reducer design, three-bar truss design, and cantilever beam design. The obtained results show that SCSO performs successfully in convergence rate and in locating all or most of the local/global optima, and outperforms the other compared methods.
Journal Article
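The exploration-to-exploitation transition the abstract describes can be sketched generically: a population samples around the current best solution with a search radius that decays over iterations, so early steps roam widely and late steps refine. This is an illustrative sketch of that balancing idea only, not the published SCSO update rules.

```python
import numpy as np

def adaptive_swarm_search(f, dim, pop=20, iters=200, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (pop, dim))
    best = min(X, key=f).copy()                 # best of the initial population
    for t in range(iters):
        radius = 2.0 * (1.0 - t / iters)        # decays toward 0: explore -> exploit
        for _ in range(pop):
            cand = np.clip(best + rng.uniform(-radius, radius, dim), lo, hi)
            if f(cand) < f(best):               # greedy acceptance
                best = cand
    return best

sphere = lambda v: float(np.sum(v * v))         # global optimum at the origin
best = adaptive_swarm_search(sphere, dim=2)
```

The decaying radius plays the role of SCSO's adaptive strategy for movement direction and speed; the real algorithm additionally distinguishes a search phase from an attack phase per candidate.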
Advances in Sparrow Search Algorithm: A Comprehensive Survey
by
Gharehchopogh, Farhad Soleimanian
,
Namazi, Mohammad
,
Abdollahzadeh, Benyamin
in
Algorithms
,
Artificial neural networks
,
Engineering
2023
Mathematical programming and meta-heuristics are two types of optimization methods. Meta-heuristic algorithms can identify optimal/near-optimal solutions by mimicking natural behaviours or occurrences, and offer benefits such as simplicity of execution, few parameters, avoidance of local optima, and flexibility. Many meta-heuristic algorithms have been introduced to solve optimization problems, each with its own advantages and disadvantages. Studies of meta-heuristic algorithms presented in prestigious journals show that their hybrid, improved, and mutated variants perform well. This paper reviews the sparrow search algorithm (SSA), one of the new and robust algorithms for solving optimization problems, covering all the SSA literature on variants, improvement, hybridization, and optimization. According to studies, the use of SSA in the mentioned areas has been 32%, 36%, 4%, and 28%, respectively. The highest percentage belongs to the improved category, which is analyzed in three subsections: meta-heuristics, artificial neural networks, and deep learning.
Journal Article
Atomistic Line Graph Neural Network for improved materials property predictions
2021
Graph neural networks (GNN) have been shown to provide substantial performance improvements for atomistic material representation and modeling compared with descriptor-based machine learning models. While most existing GNN models for atomistic predictions are based on atomic distance information, they do not explicitly incorporate bond angles, which are critical for distinguishing many atomic structures; furthermore, many material properties are known to be sensitive to slight changes in bond angles. We present the Atomistic Line Graph Neural Network (ALIGNN), a GNN architecture that performs message passing on both the interatomic bond graph and its line graph corresponding to bond angles. We demonstrate that angle information can be explicitly and efficiently included, leading to improved performance on multiple atomistic prediction tasks. We develop ALIGNN models for predicting 52 solid-state and molecular properties available in the JARVIS-DFT, Materials Project, and QM9 databases. ALIGNN can outperform some previously reported GNN models on atomistic prediction tasks with better or comparable model training speed.
Journal Article
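The line-graph idea is simple to see in pure Python: nodes of the line graph are the bonds of the atom graph, and two bonds are connected exactly when they share an atom, which is precisely when they form a bond angle. The toy molecules below are illustrations; this is not the ALIGNN implementation.

```python
from itertools import combinations

def line_graph_edges(bonds):
    """Given undirected bonds as atom pairs, return pairs of bonds that
    share an atom; each such pair corresponds to one bond angle."""
    bond_sets = [frozenset(b) for b in bonds]
    return [
        (tuple(sorted(b1)), tuple(sorted(b2)))
        for b1, b2 in combinations(bond_sets, 2)
        if b1 & b2                    # shared atom => an angle exists
    ]

# Water-like molecule: O(0) bonded to H(1) and H(2) -> one H-O-H angle.
water_angles = line_graph_edges([(0, 1), (0, 2)])
# Methane-like molecule: C(0) bonded to four H atoms -> C(4,2) = 6 H-C-H angles.
methane_angles = line_graph_edges([(0, i) for i in range(1, 5)])
```

ALIGNN then runs message passing on both graphs at once, so bond features and angle features update each other; the construction above only shows where the angle nodes come from.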