Catalogue Search | MBRL
Explore the vast range of titles available.
27,779 result(s) for "numerical optimization"
A Riemannian Optimization Approach for Computing Low-Rank Solutions of Lyapunov Equations
2010
We propose a new framework based on optimization on manifolds to approximate the solution of a Lyapunov matrix equation by a low-rank matrix. The method minimizes the error on the Riemannian manifold of symmetric positive semidefinite matrices of fixed rank. We detail how objects from differential geometry, like the Riemannian gradient and Hessian, can be efficiently computed for this manifold. As a minimization algorithm we use the Riemannian trust-region method of [P.-A. Absil, C. Baker, and K. Gallivan, Found. Comput. Math., 7 (2007), pp. 303-330] based on a second-order model of the objective function on the manifold. Together with an efficient preconditioner, this method can find low-rank solutions with very little memory. We illustrate our results with numerical examples.
Journal Article
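The entry above approximates the solution of a Lyapunov matrix equation by a matrix of fixed low rank. As a rough illustration of the low-rank ansatz only, the sketch below minimizes the residual of A P + P A^T + Q = 0 over a factor Y with P = Y Y^T by plain Euclidean gradient descent; it is not the paper's Riemannian trust-region method, and the matrix sizes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def lowrank_lyapunov_gd(A, Q, rank, steps=3000, lr=2e-4, seed=0):
    """Minimize ||A @ P + P @ A.T + Q||_F^2 over P = Y @ Y.T, Y of shape (n, rank).

    Plain Euclidean gradient descent on the factor Y, illustrating only the
    low-rank ansatz, not the paper's Riemannian trust-region method.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Y = 0.01 * rng.standard_normal((n, rank))
    for _ in range(steps):
        P = Y @ Y.T
        R = A @ P + P @ A.T + Q          # residual of the Lyapunov equation
        G = A.T @ R + R @ A              # gradient of ||R||_F^2 with respect to P
        Y -= lr * 4.0 * G @ Y            # chain rule through P = Y @ Y.T; step size may need tuning
    return Y

if __name__ == "__main__":
    n, r = 50, 5
    rng = np.random.default_rng(1)
    A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))   # roughly stable test matrix
    B = rng.standard_normal((n, r))
    Q = B @ B.T                                           # low-rank, symmetric right-hand side
    Y = lowrank_lyapunov_gd(A, Q, rank=r)
    P = Y @ Y.T
    print("residual norm:", np.linalg.norm(A @ P + P @ A.T + Q))
```

The paper's preconditioned Riemannian trust-region method converges far faster than this naive descent; the sketch only shows why a rank-r factor suffices to represent the iterate with little memory.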
Multi-Population Artificial Bee Colony (MPABC) Algorithm for Numerical Optimization
2018
This paper proposes a variant of the artificial bee colony (ABC) algorithm, called the multi-population artificial bee colony (MPABC) algorithm, for optimizing numerical functions that have one or several global optima. In MPABC, the solution space (i.e., the set of food sources) is partitioned into subspaces, in each of which a subpopulation of bees searches for a local optimum in parallel. Among these local optima, those with the highest fitness are taken as the global optima at each iteration step, and the corresponding local solutions are thus the global solutions. With a reasonable partition of the solution space, MPABC can locate all global optima and all associated global solutions. The standard ABC cannot do this, since it seeks a single global optimum per run. In addition, MPABC achieves faster convergence and higher accuracy than ABC. Experiments on a set of numerical test functions validate these conclusions.
Journal Article
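The MPABC entry above hinges on partitioning the search space into subspaces, each explored by its own subpopulation, and collecting the best local optima as global candidates. The sketch below shows only that partition-and-collect skeleton, with a trivial random local search standing in for the employed/onlooker/scout bee phases; the test function, the partition along the first coordinate, and the tolerance are assumptions.

```python
import numpy as np

def local_random_search(f, low, high, iters=500, rng=None):
    """Very simple local search inside one subspace [low, high] (stand-in for the bee phases)."""
    rng = rng or np.random.default_rng()
    best_x = rng.uniform(low, high)
    best_f = f(best_x)
    for _ in range(iters):
        cand = np.clip(best_x + 0.1 * (high - low) * rng.standard_normal(best_x.shape), low, high)
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    return best_x, best_f

def multi_population_search(f, low, high, n_subspaces=4, tol=0.05, seed=0):
    """Partition the first coordinate's range into subspaces, search each with its own
    subpopulation, and keep every local optimum whose fitness is within tol of the best,
    mimicking how MPABC collects multiple global solutions."""
    rng = np.random.default_rng(seed)
    edges = np.linspace(low[0], high[0], n_subspaces + 1)
    results = []
    for i in range(n_subspaces):
        sub_low, sub_high = low.copy(), high.copy()
        sub_low[0], sub_high[0] = edges[i], edges[i + 1]
        results.append(local_random_search(f, sub_low, sub_high, rng=rng))
    best_f = min(fx for _, fx in results)
    return [(x, fx) for x, fx in results if fx <= best_f + tol]

if __name__ == "__main__":
    # Separable multimodal test function with global minima at every combination of x_i = +/-1.
    f = lambda x: np.sum((x ** 2 - 1.0) ** 2)
    d = 2
    solutions = multi_population_search(f, np.full(d, -2.0), np.full(d, 2.0))
    for x, fx in solutions:
        print(np.round(x, 3), round(fx, 6))
```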
Optimization Methods for Large-Scale Machine Learning
by Nocedal, Jorge; Curtis, Frank E.; Bottou, Léon
in algorithm complexity analysis; machine learning; MATHEMATICS AND COMPUTING
2018
This paper provides a review and commentary on the past, present, and future of numerical optimization algorithms in the context of machine learning applications. Through case studies on text classification and the training of deep neural networks, we discuss how optimization problems arise in machine learning and what makes them challenging. A major theme of our study is that large-scale machine learning represents a distinctive setting in which the stochastic gradient (SG) method has traditionally played a central role while conventional gradient-based nonlinear optimization techniques typically falter. Based on this viewpoint, we present a comprehensive theory of a straightforward, yet versatile SG algorithm, discuss its practical behavior, and highlight opportunities for designing algorithms with improved performance. This leads to a discussion about the next generation of optimization methods for large-scale machine learning, including an investigation of two main streams of research on techniques that diminish noise in the stochastic directions and methods that make use of second-order derivative approximations.
Journal Article
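The survey above centers on the stochastic gradient (SG) method for large-scale learning. A minimal minibatch SGD for L2-regularized logistic regression on synthetic data is sketched below as a textbook instance of that method; the problem size, step size, and regularization constant are illustrative choices, not taken from the paper.

```python
import numpy as np

def sgd_logistic(X, y, batch=32, epochs=20, lr=0.1, lam=1e-3, seed=0):
    """Minibatch stochastic gradient descent for L2-regularized logistic regression.

    Minimizes (1/n) * sum_i log(1 + exp(-y_i * w^T x_i)) + (lam/2) * ||w||^2
    with labels y in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            b = idx[start:start + batch]
            margins = y[b] * (X[b] @ w)
            # Gradient of the logistic loss on the minibatch plus the regularizer.
            g = -(X[b] * (y[b] / (1.0 + np.exp(margins)))[:, None]).mean(axis=0) + lam * w
            w -= lr * g
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d = 5000, 20
    w_true = rng.standard_normal(d)
    X = rng.standard_normal((n, d))
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(n))
    w = sgd_logistic(X, y)
    print(f"training accuracy: {np.mean(np.sign(X @ w) == y):.3f}")
```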
Red-billed blue magpie optimizer: a novel metaheuristic algorithm for 2D/3D UAV path planning and engineering design problems
2024
Numerical optimization, Unmanned Aerial Vehicle (UAV) path planning, and engineering design problems are fundamental to the development of artificial intelligence. Traditional methods show limitations in dealing with these complex nonlinear models. To address these challenges, the swarm intelligence algorithm is introduced as a metaheuristic method and effectively implemented. However, existing technology exhibits drawbacks such as slow convergence speed, low precision, and poor robustness. In this paper, we propose a novel metaheuristic approach called the Red-billed Blue Magpie Optimizer (RBMO), inspired by the cooperative and efficient predation behaviors of red-billed blue magpies. The mathematical model of RBMO was established by simulating the searching, chasing, attacking prey, and food storage behaviors of the red-billed blue magpie. To demonstrate RBMO's performance, we first conduct qualitative analyses through convergence behavior experiments. Next, RBMO's numerical optimization capabilities are substantiated using CEC2014 (Dim = 10, 30, 50, and 100) and CEC2017 (Dim = 10, 30, 50, and 100) suites, consistently achieving the best Friedman mean rank. In UAV path planning applications (two-dimensional and three-dimensional), RBMO obtains preferable solutions, demonstrating its effectiveness in solving NP-hard problems. Additionally, in five engineering design problems, RBMO consistently yields the minimum cost, showcasing its advantage in practical problem-solving. We compare our experimental results with three categories of widely recognized algorithms: (1) advanced variants, (2) recently proposed algorithms, and (3) high-performance optimizers, including CEC winners.
Journal Article
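The RBMO entry above reports results on the CEC2014/CEC2017 suites across several dimensions. Since the RBMO update equations are not given in this record, the sketch below only illustrates the kind of benchmark harness such comparisons rely on, using SciPy's differential evolution as a stand-in optimizer on the Rastrigin function; the function, dimensions, and run counts are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    """Classic multimodal benchmark with global minimum 0 at the origin."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

def benchmark(dim, runs=10):
    """Run a stand-in optimizer several times and report best/mean/std of the final fitness,
    mirroring how metaheuristics are compared on CEC-style suites."""
    bounds = [(-5.12, 5.12)] * dim
    finals = np.array([
        differential_evolution(rastrigin, bounds, seed=s, maxiter=300, tol=1e-8).fun
        for s in range(runs)
    ])
    print(f"dim={dim}: best={finals.min():.3e}  mean={finals.mean():.3e}  std={finals.std():.3e}")

if __name__ == "__main__":
    for dim in (10, 30):
        benchmark(dim)
```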
An efficient hybrid algorithm based on Water Cycle and Moth-Flame Optimization algorithms for solving numerical and constrained engineering optimization problems
by Khalilpourazari, Soheyl; Khalilpourazary, Saman
in Algorithms; Artificial Intelligence; Benchmarks
2019
This paper proposes a hybrid algorithm based on the Water Cycle and Moth-Flame Optimization algorithms for solving numerical and constrained engineering optimization problems. The spiral movement of moths in the Moth-Flame Optimization algorithm is introduced into the Water Cycle Algorithm to enhance its exploitation ability. In addition, to increase randomization in the new hybrid method, the streams in the Water Cycle Algorithm are allowed to update their positions using a random walk (Lévy flight), which significantly improves the exploration ability of the Water Cycle Algorithm. The performance of the new hybrid Water Cycle-Moth-Flame Optimization algorithm (WCMFO) is investigated on 23 benchmark functions, including unimodal, multimodal, and fixed-dimension multimodal functions, and compared against other state-of-the-art metaheuristic algorithms. The results show that the hybrid method outperforms these algorithms on the majority of the benchmark functions. To evaluate the efficiency of WCMFO in solving complex constrained engineering and real-life problems, three well-known structural engineering problems are solved using WCMFO and the results are compared with those of other metaheuristics in the literature. The simulations reveal that WCMFO provides very competitive and promising results compared with the other hybrid and metaheuristic algorithms.
Journal Article
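The WCMFO entry above lets the streams of the Water Cycle Algorithm take Lévy-flight steps to improve exploration. A common way to draw such steps is Mantegna's algorithm, sketched below; the stability index and step scale are typical illustrative values rather than the paper's settings.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=None):
    """Draw one Levy-flight step via Mantegna's algorithm.

    step = u / |v|**(1/beta), with u ~ N(0, sigma_u**2) and v ~ N(0, 1),
    where sigma_u depends on the stability index beta.
    """
    rng = rng or np.random.default_rng()
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1.0 / beta)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    position = np.zeros(5)
    for _ in range(3):
        # Mostly small steps with occasional long jumps -- the exploration behavior WCMFO exploits.
        position = position + 0.01 * levy_step(5, rng=rng)
        print(np.round(position, 4))
```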
Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm
by Jurado, Francisco; Turky, Rania A.; Tostado-Véliz, Marcos
in algorithms; Biology; circle search algorithm
2022
This paper presents a novel metaheuristic optimization algorithm inspired by the geometrical features of circles, called the circle search algorithm (CSA). The circle is the most well-known geometric object, with various features including diameter, center, perimeter, and tangent lines. The ratio between the radius and the tangent line segment is the orthogonal function of the angle opposite to the orthogonal radius. This angle plays an important role in the exploration and exploitation behavior of the CSA. To evaluate the robustness of the CSA in comparison to other algorithms, many independent experiments employing 23 famous functions and 3 real engineering problems were carried out. The statistical results revealed that the CSA succeeded in achieving the minimum fitness values for 21 out of the tested 23 functions, and the p-value was less than 0.05. The results evidence that the CSA converged to the minimum results faster than the comparative algorithms. Furthermore, high-dimensional functions were used to assess the CSA’s robustness, with statistical results revealing that the CSA is robust to high-dimensional problems. As a result, the proposed CSA is a promising algorithm that can be used to easily handle a wide range of optimization problems.
Journal Article
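The CSA entry above reports p-values below 0.05 across 23 benchmark functions. The sketch below shows how such a pairwise comparison is commonly carried out with the Wilcoxon signed-rank test on per-function best fitness values; the numbers here are synthetic placeholders, not the paper's results.

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic per-function best fitness values for two optimizers on 23 benchmarks
# (placeholders only; the paper's actual numbers are not reproduced here).
rng = np.random.default_rng(0)
fitness_a = rng.exponential(1.0, 23)                # hypothetical "algorithm A" results
fitness_b = fitness_a + rng.exponential(0.5, 23)    # hypothetical, systematically worse "algorithm B"

# One-sided test: does algorithm A reach lower (better) fitness than algorithm B?
stat, p_value = wilcoxon(fitness_a, fitness_b, alternative="less")
print(f"Wilcoxon statistic = {stat:.1f}, p-value = {p_value:.4f}")
if p_value < 0.05:
    print("algorithm A is significantly better at the 5% level")
```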
Mutual Coupling Reduction in Antenna Arrays Using Artificial Intelligence Approach and Inverse Neural Network Surrogates
by Golunski, Lukasz; Chaudhary, Muhammad Akmal; Ghadi, Yazeed Yasin
in Algorithms; Analysis; Antenna arrays
2023
This paper presents a novel approach to reducing undesirable coupling in antenna arrays using custom-designed resonators and inverse surrogate modeling. To illustrate the concept, two standard patch antenna cells with 0.07λ edge-to-edge distance were designed and fabricated to operate at 2.45 GHz. A stepped-impedance resonator was applied between the antennas to suppress their mutual coupling. For the first time, the optimum values of the resonator geometry parameters were obtained using the proposed inverse artificial neural network (ANN) model, constructed from the sampled EM-simulation data of the system, and trained using the particle swarm optimization (PSO) algorithm. The inverse ANN surrogate directly yields the optimum resonator dimensions based on the target values of its S-parameters being the input parameters of the model. The involvement of surrogate modeling also contributes to the acceleration of the design process, as the array does not need to undergo direct EM-driven optimization. The obtained results indicate a remarkable cancellation of the surface currents between two antennas at their operating frequency, which translates into isolation as high as −46.2 dB at 2.45 GHz, corresponding to over 37 dB improvement as compared to the conventional setup.
Journal Article
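The antenna entry above trains an inverse ANN surrogate that maps target S-parameters directly to resonator dimensions, using EM-simulation samples and PSO-based training. The sketch below keeps only the inverse-mapping idea on a synthetic forward model, with scikit-learn's MLPRegressor and its standard gradient-based trainer in place of PSO; the forward model, parameter ranges, and target response are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def forward_model(geom):
    """Hypothetical forward map: resonator geometry (width, length) -> two S-parameter features.
    In the paper this mapping comes from EM simulations; here it is a synthetic stand-in."""
    w, l = geom[..., 0], geom[..., 1]
    return np.stack([np.tanh(w) + 0.3 * l, 0.5 * l - 0.2 * w], axis=-1)

rng = np.random.default_rng(0)
geoms = rng.uniform(0.0, 2.0, size=(2000, 2))      # sampled geometry parameters
responses = forward_model(geoms)                    # "simulated" S-parameter features

# Inverse surrogate: inputs are the responses, outputs are the geometry parameters.
inverse_ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
inverse_ann.fit(responses, geoms)

target_response = np.array([[0.9, 0.4]])            # desired S-parameter features (assumed)
predicted_geometry = inverse_ann.predict(target_response)
print("predicted resonator dimensions:", np.round(predicted_geometry, 3))
print("response at the prediction:", np.round(forward_model(predicted_geometry), 3))
```

The appeal, as the abstract notes, is that the surrogate returns dimensions directly from a target response, so the array does not have to undergo direct EM-driven optimization.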
Optimal Control Computation for Nonlinear Fractional Time-Delay Systems with State Inequality Constraints
2021
In this paper, a numerical method is developed for solving a class of delay fractional optimal control problems involving nonlinear time-delay systems and subject to state inequality constraints. The fractional derivatives in this class of problems are described in the sense of Caputo, and they can be of different orders. First, we propose a numerical integration scheme for the fractional time-delay system and prove that the convergence rate of the numerical solution to the exact one is of second order based on Taylor expansion and linear interpolation. This gives rise to a discrete-time optimal control problem. Then, we derive the gradient formulas of the cost and constraint functions with respect to the decision variables and present a gradient computation procedure. On this basis, a gradient-based optimization algorithm is developed to solve the resulting discrete-time optimal control problem. Finally, several example problems are solved to demonstrate the effectiveness of the developed solution approach.
Journal Article
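The optimal-control entry above discretizes Caputo fractional derivatives inside a nonlinear time-delay system. As a minimal illustration of numerically approximating a fractional derivative, the sketch below uses the first-order Grünwald-Letnikov scheme (which agrees with the Caputo derivative of order between 0 and 1 for functions vanishing at zero); it is not the paper's second-order integration scheme, and the test function and order are illustrative.

```python
import numpy as np
from math import gamma

def gl_fractional_derivative(f_values, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha fractional derivative
    on a uniform grid with step h.

    D^alpha f(t_n) ~ h**(-alpha) * sum_{k=0..n} w_k * f(t_{n-k}),
    with w_0 = 1 and w_k = w_{k-1} * (1 - (alpha + 1) / k).
    """
    n = len(f_values)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    deriv = np.empty(n)
    for i in range(n):
        deriv[i] = h ** (-alpha) * np.dot(w[: i + 1], f_values[i::-1])
    return deriv

if __name__ == "__main__":
    alpha, h = 0.5, 1e-3
    t = np.arange(0.0, 1.0 + h, h)
    approx = gl_fractional_derivative(t, alpha, h)       # f(t) = t, f(0) = 0
    exact = t ** (1.0 - alpha) / gamma(2.0 - alpha)      # known fractional derivative of f(t) = t
    print("max abs error:", np.max(np.abs(approx - exact)[1:]))
```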
Theory and Applications of Robust Optimization
by Bertsimas, Dimitris; Brown, David B.; Caramanis, Constantine
in Algorithms; Analysis; Approximation
2011
In this paper we survey the primary research, both theoretical and applied, in the area of robust optimization (RO). Our focus is on the computational attractiveness of RO approaches, as well as the modeling power and broad applicability of the methodology. In addition to surveying prominent theoretical results of RO, we also present some recent results linking RO to adaptable models for multistage decision-making problems. Finally, we highlight applications of RO across a wide spectrum of domains, including finance, statistics, learning, and various areas of engineering.
Journal Article
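The robust optimization survey above stresses the computational attractiveness of RO: many uncertain constraints have tractable worst-case counterparts. The toy example below makes that concrete for a single linear constraint with interval (box) uncertainty on its coefficients and a nonnegative decision vector, solved with SciPy's linprog; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Nominal problem: maximize 3*x1 + 2*x2  subject to  a^T x <= 10, x >= 0,
# where each coefficient a_i lies in the interval [a_bar_i - delta_i, a_bar_i + delta_i].
a_bar = np.array([2.0, 1.0])
delta = np.array([0.5, 0.3])
c = np.array([-3.0, -2.0])          # linprog minimizes, so negate the objective

# Nominal solution (ignores uncertainty).
nominal = linprog(c, A_ub=[a_bar], b_ub=[10.0], bounds=[(0, None), (0, None)])

# Robust counterpart: for x >= 0 the worst case of a^T x over the box is (a_bar + delta)^T x,
# so the semi-infinite constraint collapses to a single linear one.
robust = linprog(c, A_ub=[a_bar + delta], b_ub=[10.0], bounds=[(0, None), (0, None)])

print("nominal x:", np.round(nominal.x, 3), " objective:", round(-nominal.fun, 3))
print("robust  x:", np.round(robust.x, 3), " objective:", round(-robust.fun, 3))
```

The robust solution sacrifices some nominal objective value but remains feasible for every coefficient vector in the uncertainty set, which is the trade-off the survey formalizes.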
Stock intelligent investment strategy based on support vector machine parameter optimization algorithm
by Li, Xuetao; Sun, Yi
in Artificial Intelligence; Computational Biology/Bioinformatics; Computational Science and Engineering
2020
Changes in China's stock market are inseparable from the country's economic development and macroeconomic regulation, and they have far-reaching significance for China's national economic growth. Compared with developed Western capital markets, the smart investment strategies currently used in China's stock market still have certain shortcomings. Based on the support vector machine (SVM), this paper establishes a predictive model that combines kernel selection with parameter optimization. Grid search, a genetic algorithm, and particle swarm optimization are used to tune the SVM parameters under several kernel functions, such as the radial basis function kernel, to strengthen the model's applicability in practice. The empirical results show that, under all three parameter optimization algorithms, the prediction accuracy exceeds that of random guessing, indicating that tuning the SVM parameters is effective. Among them, the SVM tuned by the genetic algorithm with the radial basis function kernel gives the best prediction performance and is closest to the true values in the stock market forecast, while the accuracy of the particle swarm optimization variant is slightly lower than that of grid search. In addition, comparison experiments show that the prediction accuracy of a BP neural network is lower than that of the SVM model even before parameter tuning. Finally, the trained model is used to construct a smart stock investment plan.
Journal Article
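The stock-prediction entry above tunes SVM kernel parameters by grid ("mesh") search, a genetic algorithm, and particle swarm optimization. The sketch below shows only the grid-search variant with scikit-learn's GridSearchCV over C and gamma of an RBF-kernel SVC on synthetic data; the features, grid, and data are placeholders rather than the paper's stock data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data standing in for engineered stock features.
X, y = make_classification(n_samples=1000, n_features=15, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Grid search over the RBF-kernel SVM hyperparameters (the "mesh search" of the abstract);
# the GA and PSO variants would explore the same space with evolutionary updates instead.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5, n_jobs=-1)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
print("test accuracy:", round(search.score(X_test, y_test), 3))
```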