Catalogue Search | MBRL
Explore the vast range of titles available.
85,800 result(s) for "MATHEMATICS / Optimization"
Analytical Evaluation of Uncertainty Propagation for Probabilistic Design Optimisation
by Kuang, Ye Chow; Demidenko, Serge; Ooi, Melanie Po-Leen
in Mathematical optimization; System design
2023
This book presents a novel approach to evaluating and dealing with uncertainties in the parameters and processes of technical systems, with application to the probabilistic optimisation of engineering design, while achieving the required high levels of efficiency, robustness, and reliability.
Self-Regularity
by Terlaky, Tamás; Roos, Cornelis; Peng, Jiming
in Algorithm; Analysis of algorithms; Analytic function
2009, 2002, 2003
Research on interior-point methods (IPMs) has dominated the field of mathematical programming for the last two decades. Two contrasting approaches in the analysis and implementation of IPMs are the so-called small-update and large-update methods, although, until now, there has been a notorious gap between the theory and practical performance of these two strategies. This book comes close to bridging that gap, presenting a new framework for the theory of primal-dual IPMs based on the notion of the self-regularity of a function.
The authors deal with linear optimization, nonlinear complementarity problems, semidefinite optimization, and second-order conic optimization problems. The framework also covers large classes of linear complementarity problems and convex optimization. The algorithm considered can be interpreted as a path-following method or a potential reduction method. Starting from a primal-dual strictly feasible point, the algorithm chooses a search direction defined by some Newton-type system derived from the self-regular proximity. The iterate is then updated, with the iterates staying in a certain neighborhood of the central path until an approximate solution to the problem is found. By extensively exploring some intriguing properties of self-regular functions, the authors establish that the complexity of large-update IPMs can come arbitrarily close to the best known iteration bounds of IPMs.
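The self-regular proximity measure described above can be sketched in a few lines. This is an illustrative member of the family of self-regular kernel functions studied in this line of work, with a hypothetical parameter choice q = 2, not the book's exact definition:

```python
# Illustrative self-regular kernel: grows quadratically as t -> infinity and
# like t^(1-q) as t -> 0+, with minimum value 0 attained at t = 1 (i.e. on
# the central path).
def kernel(t, q=2.0):
    return (t**2 - 1.0) / 2.0 + (t**(1.0 - q) - 1.0) / (q - 1.0)

def proximity(v, q=2.0):
    # v: positive scaled complementarity values of the current iterate;
    # the proximity is zero iff every v_i = 1 (the iterate is central).
    return sum(kernel(t, q) for t in v)

print(proximity([1.0, 1.0, 1.0]))        # exactly on the central path
print(proximity([0.5, 1.0, 2.0]) > 0)    # off the central path
```

The algorithm drives this proximity down each iteration; the barrier-like growth of the kernel near zero keeps iterates strictly feasible.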
Researchers and postgraduate students in all areas of linear and nonlinear optimization will find this book an important and invaluable aid to their work.
Modern optimization methods for science, engineering and technology
by Chamorshikar, Rajesh; Desai, Santosh R; Sinha, G. R
in Mathematical optimization; MATHEMATICS; Operations research
2020
Achieving a better solution or improving the performance of an existing system design is an ongoing process for which scientists, engineers, mathematicians and researchers have been striving for many years. Increasingly practical and robust methods have been developed, and every new generation of computers, with their increased power and speed, allows for the development and wider application of new types of solutions. This book defines the fundamentals, background and theoretical concepts of optimization principles in a comprehensive manner, along with their potential applications and implementation strategies. It encompasses linear programming, multivariable methods for risk assessment, nonlinear methods, ant colony optimization, particle swarm optimization, multi-criterion and topology optimization, learning classifiers, case studies on six sigma, performance measures and evaluation, multi-objective optimization problems, machine learning approaches, genetic algorithms and quality-of-service optimizations. The book will be very useful for a wide spectrum of target readers, including students and researchers in academia and industry.
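One of the population-based methods listed above, particle swarm optimization, can be sketched in a few lines. The parameter values (inertia and acceleration coefficients) are common textbook defaults, not taken from the book, and the sphere function is an illustrative test objective:

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + pull toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

best, val = pso(lambda p: sum(x * x for x in p))   # minimize the sphere function
print(val < 1e-2)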
Optimization algorithms on matrix manifolds
2008
Many problems in the sciences and engineering can be rephrased as optimization problems on matrix search spaces endowed with a so-called manifold structure. This book shows how to exploit the special structure of such problems to develop efficient numerical algorithms. It places careful emphasis on both the numerical formulation of the algorithm and its differential geometric abstraction--illustrating how good algorithms draw equally from the insights of differential geometry, optimization, and numerical analysis. Two more theoretical chapters provide readers with the background in differential geometry necessary for algorithmic development. In the other chapters, several well-known optimization methods such as steepest descent and conjugate gradients are generalized to abstract manifolds. The book provides a generic development of each of these methods, building upon the material of the geometric chapters. It then guides readers through the calculations that turn these geometrically formulated methods into concrete numerical algorithms. The state-of-the-art algorithms given as examples are competitive with the best existing algorithms for a selection of eigenspace problems in numerical linear algebra.
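The generalization of steepest descent mentioned above can be illustrated on an eigenspace problem of the kind the book treats: minimizing the Rayleigh quotient x^T A x over the unit sphere, whose minimizer is an eigenvector for the smallest eigenvalue. This sketch uses tangent-space projection for the Riemannian gradient and renormalization as the retraction; the 2x2 matrix and step size are illustrative choices:

```python
def sphere_steepest_descent(A, x, step=0.1, iters=500):
    n = len(x)
    for _ in range(n and iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = sum(x[i] * Ax[i] for i in range(n))        # Rayleigh quotient
        # Riemannian gradient: Euclidean gradient 2*A*x projected onto the
        # tangent space of the sphere at x.
        grad = [2.0 * (Ax[i] - lam * x[i]) for i in range(n)]
        y = [x[i] - step * grad[i] for i in range(n)]
        norm = sum(v * v for v in y) ** 0.5               # retraction: renormalize
        x = [v / norm for v in y]
    return x, lam

A = [[2.0, 0.0], [0.0, 5.0]]            # eigenvalues 2 and 5
x, lam = sphere_steepest_descent(A, [0.6, 0.8])
print(round(lam, 4))                    # converges to the smallest eigenvalue, 2.0
```

The retraction keeps every iterate exactly on the manifold, which is the structural point the book develops in full generality.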
Stochastic global optimization
by Rangaiah, Gade Pandu
in Chemical Engineering; Chemical processes; Industrial and Systems Engineering
2010
Optimization has played a key role in the design, planning and operation of chemical and related processes for several decades. Global optimization has been receiving considerable attention over the past two decades. Of the two types of techniques for global optimization, stochastic global optimization is applicable to any type of problem, including those with non-differentiable functions, discrete variables and/or continuous variables. It thus shows significant promise and potential for process optimization.
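A representative stochastic global optimization technique, simulated annealing, can be sketched briefly. It needs no derivatives, which is why such methods apply to the non-differentiable problems mentioned above; the test function, cooling schedule and parameters here are illustrative choices, not taken from the book:

```python
import math, random

def simulated_annealing(f, x0, iters=20000, t0=2.0, seed=1):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(iters):
        temp = t0 * (1.0 - k / iters) + 1e-9     # linear cooling schedule
        y = x + rng.gauss(0.0, 0.5)              # random neighbour
        fy = f(y)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fy - fx) / temp), so the search
        # can escape local minima while the temperature is high.
        if fy < fx or rng.random() < math.exp((fx - fy) / temp):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Multimodal test function: global minimum near x = -1.30 (f about -3.51),
# plus a local minimum near x = 1.13 that traps pure descent methods.
f = lambda x: x**4 - 3 * x**2 + x
best, fbest = simulated_annealing(f, x0=2.0)
print(fbest < -3.0)
```

Starting in the basin of the local minimum, the annealed chain still crosses into the global basin, which a deterministic descent from the same start would miss.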
Topological Optimization and Optimal Transport
by Champion, Thierry; Bergounioux, Maïtine; Rumpf, Martin
in Image processing; COMPUTERS / Computer Vision & Pattern Recognition; COMPUTERS / Programming / Algorithms
2017
Discussing topics such as shape representations, relaxation theory and optimal transport, this book explores the trends and synergies of the mathematical tools required for the optimization of the geometry and topology of shapes.
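Optimal transport has a classical one-dimensional instance that fits in a few lines: for points on the line with a convex cost such as squared distance and equal uniform weights, the optimal coupling is the monotone (sorted) matching. The sketch below checks this against brute-force enumeration; the point sets are arbitrary illustrative data:

```python
from itertools import permutations

def transport_cost(xs, ys, matching):
    # matching[i] = index of the y-point paired with xs[i]; squared-distance cost.
    return sum((xs[i] - ys[matching[i]]) ** 2 for i in range(len(xs)))

xs = [0.0, 2.0, 5.0]
ys = [1.5, 1.0, 6.0]

# Monotone matching: the k-th smallest x pairs with the k-th smallest y.
n = len(xs)
xi = sorted(range(n), key=lambda i: xs[i])
yi = sorted(range(n), key=lambda j: ys[j])
monotone = [0] * n
for k in range(n):
    monotone[xi[k]] = yi[k]

# Brute force over all n! matchings confirms the monotone one is optimal.
brute = min(permutations(range(n)), key=lambda m: transport_cost(xs, ys, m))
print(transport_cost(xs, ys, monotone) == transport_cost(xs, ys, brute))
```

For general weights and higher dimensions the problem becomes a genuine linear program, which is where the relaxation machinery discussed in the book takes over.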
Robust Optimization
by Nemirovski, Arkadi; El Ghaoui, Laurent; Ben-Tal, Aharon
in Accuracy and precision; Additive model; Almost surely
2009
Robust optimization is still a relatively new approach to optimization problems affected by uncertainty, but it has already proved so useful in real applications that it is difficult to tackle such problems today without considering this powerful methodology. Written by the principal developers of robust optimization, and describing the main achievements of a decade of research, this is the first book to provide a comprehensive and up-to-date account of the subject.
Robust optimization is designed to meet some major challenges associated with uncertainty-affected optimization problems: to operate under lack of full information on the nature of uncertainty; to model the problem in a form that can be solved efficiently; and to provide guarantees about the performance of the solution.
The book starts with a relatively simple treatment of uncertain linear programming, proceeding with a deep analysis of the interconnections between the construction of appropriate uncertainty sets and the classical chance constraints (probabilistic) approach. It then develops the robust optimization theory for uncertain conic quadratic and semidefinite optimization problems and dynamic (multistage) problems. The theory is supported by numerous examples and computational illustrations.
An essential book for anyone working on optimization and decision making under uncertainty, Robust Optimization also makes an ideal graduate textbook on the subject.
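The uncertain linear programming treatment that opens the book has a small worked instance: a constraint a·x <= b that must hold for every coefficient vector a in a box around a nominal ahat. Its robust counterpart is a single deterministic inequality; the data below are illustrative, not from the book:

```python
# Uncertain constraint: a.x <= b for every a with |a_i - ahat_i| <= delta_i.
ahat, delta, b = [1.0, 2.0], [0.3, 0.5], 5.0
x = [1.0, 1.2]

# Robust counterpart for box uncertainty: ahat.x + sum_i delta_i * |x_i| <= b,
# a single deterministic (and still convex) inequality.
robust_lhs = (sum(a * v for a, v in zip(ahat, x))
              + sum(d * abs(v) for d, v in zip(delta, x)))

# The worst case over the box is attained at a_i = ahat_i + delta_i * sign(x_i),
# so the robust left-hand side equals the true worst-case value.
worst = sum((a + d * (1.0 if v >= 0 else -1.0)) * v
            for a, d, v in zip(ahat, delta, x))

print(abs(robust_lhs - worst) < 1e-12, robust_lhs <= b)
```

For richer uncertainty sets (ellipsoids, conic sets) the counterpart becomes a conic constraint rather than an absolute-value sum, which is the progression the book follows.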
Fuzzy Multi-Criteria Decision Making
2008
Summarizing the concepts and results of the most popular fuzzy multicriteria methods using numerical examples, this work examines all of the most recently developed methods. Each of the 22 chapters includes practical applications along with new results.
Automated tight Lyapunov analysis for first-order methods
2025
We present a methodology for establishing the existence of quadratic Lyapunov inequalities for a wide range of first-order methods used to solve convex optimization problems. In particular, we consider (i) classes of optimization problems of finite-sum form with (possibly strongly) convex and possibly smooth functional components, (ii) first-order methods that can be written as a linear system on state-space form in feedback interconnection with the subdifferentials of the functional components of the objective function, and (iii) quadratic Lyapunov inequalities that can be used to draw convergence conclusions. We present a necessary and sufficient condition for the existence of a quadratic Lyapunov inequality within a predefined class of Lyapunov inequalities, which amounts to solving a small-sized semidefinite program. We showcase our methodology on several first-order methods that fit the framework. Most notably, our methodology allows us to significantly extend the region of parameter choices that allow for duality gap convergence in the Chambolle-Pock method when the linear operator is the identity mapping.
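A drastically simplified instance of the quadratic Lyapunov idea can be checked numerically (the paper's contribution is to automate finding such certificates for general first-order methods via a small semidefinite program; this sketch hard-codes a known certificate for plain gradient descent on a quadratic):

```python
# Gradient descent x+ = x - a*grad f(x) on a quadratic whose curvature lies
# in [mu, L].  The quadratic Lyapunov function V(x) = ||x - x*||^2 contracts
# by the factor rho = max(|1 - a*mu|, |1 - a*L|)^2 at every step.
mu, L = 1.0, 10.0
a = 2.0 / (mu + L)                       # classical optimal step size
rho = max(abs(1 - a * mu), abs(1 - a * L)) ** 2

def step(x):
    # f(x) = 0.5 * (mu * x[0]**2 + L * x[1]**2), minimizer x* = 0
    return [x[0] - a * mu * x[0], x[1] - a * L * x[1]]

V = lambda x: x[0] ** 2 + x[1] ** 2
x = [3.0, -2.0]
for _ in range(5):
    xn = step(x)
    assert V(xn) <= rho * V(x) + 1e-12   # the Lyapunov inequality holds each step
    x = xn
print(round(rho, 4))
```

Here rho = (9/11)^2, about 0.6694, and the inequality is tight on this example; the paper's SDP machinery searches over a whole class of such quadratic certificates instead of fixing one by hand.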
On Sudakov’s type decomposition of transference plans with norm costs
2018
We consider the original strategy proposed by Sudakov for solving the Monge transportation problem with norm cost. In this paper we show how these difficulties can be overcome, and that the original idea of Sudakov can be successfully implemented. The results yield a complete characterization of the Kantorovich optimal transportation problem, whose straightforward corollary is the solution of the Monge problem in each set.