Asset Details
Efficient computation of expected hypervolume improvement using box decomposition algorithms
by Emmerich, Michael; Yang, Kaifeng; Deutz, André; Bäck, Thomas
in Bayesian analysis / Criteria / Decomposition / Evolutionary algorithms / Gaussian process / Global optimization / Multiple objective analysis / Optimization algorithms / Upper bounds
2019
Journal Article
Overview
Multi-objective Bayesian Global Optimization (MOBGO) is an important branch of multi-objective optimization, alongside evolutionary multi-objective optimization algorithms. MOBGO uses Gaussian Process models learned from previous objective function evaluations to decide the next evaluation site by maximizing or minimizing an infill criterion. A commonly used criterion in MOBGO is the Expected Hypervolume Improvement (EHVI), which performs well on a wide range of problems with respect to both exploration and exploitation. However, computing exact EHVI values efficiently has so far been a challenge. This paper proposes an efficient algorithm for the exact calculation of the EHVI in the generic case, based on partitioning the integration volume into a set of axis-parallel slices. Theoretically, the upper-bound time complexities are improved from the previous \(O(n^2)\) and \(O(n^3)\), for two- and three-objective problems respectively, to \(\varTheta(n \log n)\), which is asymptotically optimal. The article generalizes the scheme to higher-dimensional cases by utilizing a new hyperbox decomposition technique proposed by Dächert et al. (Eur J Oper Res 260(3):841–855, 2017), together with a generalization of the multilayered integration scheme that scales linearly in the number of hyperboxes of the decomposition. Speed comparisons show that the proposed algorithm significantly reduces computation time. Finally, the decomposition technique is also applied to the calculation of the Probability of Improvement (PoI).
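To make the decomposition idea concrete, the sketch below illustrates the simplest two-objective case (minimization, with independent Gaussian predictions per objective, as is standard in MOBGO): the non-dominated region defined by the current Pareto-front approximation is partitioned into axis-parallel boxes, and the Probability of Improvement is obtained as a sum of per-box Gaussian probabilities. This is a minimal illustration under those assumptions, not the paper's general d-dimensional algorithm (which relies on the hyperbox decomposition of Dächert et al.); the function names `nondominated_boxes_2d` and `probability_of_improvement` are hypothetical.

```python
# Illustrative 2-D sketch (assumed, not the paper's code): partition the
# non-dominated region into axis-parallel boxes, then compute the Probability
# of Improvement (PoI) as a sum of per-box Gaussian probabilities.
import numpy as np
from scipy.stats import norm


def nondominated_boxes_2d(pareto_front, ref_point):
    """Decompose the 2-D non-dominated region (minimization) into boxes.

    pareto_front : (n, 2) array of mutually non-dominated objective vectors.
    ref_point    : (2,) upper bound on objective values of interest.
    Returns a list of (lower, upper) corners of axis-parallel boxes.
    """
    P = np.asarray(pareto_front, dtype=float)
    P = P[np.argsort(P[:, 0])]                 # f1 ascending => f2 descending
    r1, r2 = ref_point
    f1_edges = np.concatenate(([-np.inf], P[:, 0], [r1]))
    f2_upper = np.concatenate(([r2], P[:, 1]))  # f2 ceiling in each stripe
    boxes = []
    for i in range(len(P) + 1):
        lo = np.array([f1_edges[i], -np.inf])
        hi = np.array([f1_edges[i + 1], f2_upper[i]])
        if hi[0] > lo[0]:                       # skip degenerate stripes
            boxes.append((lo, hi))
    return boxes


def probability_of_improvement(mu, sigma, boxes):
    """PoI = probability that the candidate lands in the non-dominated region.

    mu, sigma : per-objective posterior mean / standard deviation of the
                candidate (objectives modelled independently).
    """
    poi = 0.0
    for lo, hi in boxes:
        cell = 1.0
        for j in range(len(mu)):                # 1-D CDF difference per axis
            cell *= (norm.cdf((hi[j] - mu[j]) / sigma[j])
                     - norm.cdf((lo[j] - mu[j]) / sigma[j]))
        poi += cell
    return poi


if __name__ == "__main__":
    front = np.array([[1.0, 4.0], [2.0, 3.0], [3.5, 1.5]])
    boxes = nondominated_boxes_2d(front, ref_point=np.array([5.0, 5.0]))
    print(probability_of_improvement(mu=[1.5, 2.0], sigma=[0.5, 0.5], boxes=boxes))
```

Because every box is axis-parallel and the objectives are modelled by independent Gaussians, the probability of falling inside a box factorizes into one-dimensional CDF differences; this factorization is what makes decomposition-based criteria such as PoI and EHVI cheap to evaluate once the boxes are known.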
Publisher
Springer Nature B.V