Asset Details
Principled Algorithms for Finding Local Minima
by Hinder, Oliver
in Computer science
2019
Dissertation
Overview
Convex optimization is the cornerstone of continuous optimization, but many real problems arising in machine learning and operations research are nonconvex. This two-part thesis explores my work developing principled algorithms for finding local minima of nonconvex functions.

Part I: The complexity of finding stationary points of nonconvex functions. It has long been known that gradient descent achieves an ε^(-2) rate for finding stationary points of unconstrained nonconvex functions, but better rates for first-order methods were unknown. We show that, using an algorithm that judiciously utilizes Nesterov's accelerated gradient descent, it is possible to improve this rate to ε^(-7/4) for functions with Lipschitz first and second derivatives. Adding Lipschitz third derivatives improves the rate to ε^(-5/3). Moreover, we provide almost-matching lower bounds proving that (i) finding a stationary point is easier for convex functions and (ii) acceleration in nonconvex optimization requires assumptions beyond smoothness.

Part II: Interior point methods for nonconvex optimization. The second part of this thesis focuses on an interior point method (IPM) for optimization problems with nonconvex constraints. The work of Wächter and Biegler suggests that infeasible-start IPMs developed for linear programming cannot be adapted to nonlinear optimization without significant modification, i.e., using a two-phase or penalty method. We propose an IPM that, through careful initialization and updates of the slack variables, is guaranteed to find a first-order certificate of local infeasibility, local optimality, or unboundedness of the (shifted) feasible region. Our algorithm differs from other IPMs for nonconvex programming in that we reduce primal feasibility at the same rate as the barrier parameter. This yields more robust convergence properties and closely resembles successful algorithms from linear programming. Comparisons with IPOPT on a subset of CUTEst problems indicate less frequent failures and superior performance in detecting infeasibility. Finally, we develop a highly simplified version of our IPM for problems with nonconvex constraints that obtains an ε-scaled KKT point in ε^(-7/4) iterations. This provides the first proof that IPMs for problems with nonconvex constraints have a polynomial runtime in 1/ε.
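As a point of reference for the ε^(-2) baseline that Part I improves upon, here is a minimal sketch of gradient descent run until it reaches an ε-stationary point. The objective, the smoothness constant L, and all function names are illustrative assumptions rather than code from the thesis; the classical analysis for L-smooth (possibly nonconvex) functions bounds the iteration count by O(ε^(-2)).

```python
import numpy as np

def gradient_descent_stationary(grad, x0, L, eps, max_iter=1_000_000):
    """Gradient descent with the classical 1/L step size, stopped at the
    first eps-stationary point (||grad f(x)|| <= eps). For an L-smooth,
    possibly nonconvex f, this takes O(eps**-2) iterations."""
    x = np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x, k  # eps-stationary point reached after k steps
        x -= g / L  # step size 1/L for an L-smooth objective
    return x, max_iter

# Hypothetical nonconvex example: f(x, y) = x**4/4 - x**2/2 + y**2,
# with gradient (x**3 - x, 2*y); L = 8 upper-bounds the Hessian norm
# on the region the iterates visit.
grad = lambda v: np.array([v[0]**3 - v[0], 2.0 * v[1]])
x_hat, iters = gradient_descent_stationary(grad, [1.5, 1.0], L=8.0, eps=1e-6)
print(x_hat, iters)  # converges to the local minimum near (1, 0)
```

The accelerated ε^(-7/4) method of Part I replaces the plain gradient step with carefully monitored runs of Nesterov's accelerated gradient descent; the sketch above only illustrates the baseline rate it is compared against.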
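To illustrate the barrier mechanism behind Part II, the sketch below drives a log-barrier parameter μ toward zero while approximately minimizing each barrier subproblem, loosely analogous to how the proposed IPM reduces primal infeasibility at the same rate as the barrier parameter. This is a generic textbook barrier method on a hypothetical toy problem, not the thesis's infeasible-start IPM with its slack-variable initialization and updates.

```python
import numpy as np

def log_barrier_method(f_grad, c, c_grad, x0, mu0=1.0, sigma=0.5,
                       mu_tol=1e-6, inner_steps=500):
    """Minimal log-barrier sketch for: min f(x) s.t. c_i(x) > 0.
    Approximately minimizes f(x) - mu * sum(log c_i(x)) by gradient
    steps for a geometrically decreasing sequence of barrier
    parameters mu. Generic barrier method, not the thesis's IPM."""
    x, mu = np.asarray(x0, dtype=float).copy(), mu0
    while mu > mu_tol:
        lr = mu / 10.0  # shrink the step with mu: barrier curvature grows like 1/mu
        for _ in range(inner_steps):
            # gradient of the barrier objective f(x) - mu * sum(log c_i(x))
            g = f_grad(x) - mu * c_grad(x).T @ (1.0 / c(x))
            x -= lr * g
        mu *= sigma  # drive the barrier parameter toward zero
    return x

# Hypothetical toy problem: min (x - 2)^2 subject to x <= 1,
# written with the constraint c(x) = 1 - x > 0.
f_grad = lambda x: 2.0 * (x - 2.0)
c = lambda x: np.array([1.0 - x[0]])
c_grad = lambda x: np.array([[-1.0]])  # Jacobian of c, shape (1, 1)
print(log_barrier_method(f_grad, c, c_grad, np.array([0.0])))  # -> approx [1.0]
```

Tying the inner step size to μ is a stand-in for the safeguards a real IPM uses (Newton steps on the perturbed KKT system, fraction-to-the-boundary rules); without some such coupling the subproblems become badly conditioned as μ shrinks.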