Asset Details
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
Journal Article
by Hanzely, Filip; Richtárik, Peter; Lin, Xiao
2021
Overview
We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. We investigate a triangle scaling property of the Bregman distance generated by the reference convex function and present accelerated Bregman proximal gradient (ABPG) methods that attain an O(k^{-γ}) convergence rate, where γ ∈ (0, 2] is the triangle scaling exponent (TSE) of the Bregman distance. For the Euclidean distance, we have γ = 2 and recover the convergence rate of Nesterov's accelerated gradient methods. For non-Euclidean Bregman distances, the TSE can be much smaller (say, γ ≤ 1), but we show that a relaxed definition of intrinsic TSE is always equal to 2. We exploit the intrinsic TSE to develop adaptive ABPG methods that converge much faster in practice. Although theoretical guarantees on a fast convergence rate seem to be out of reach in general, our methods obtain empirical O(k^{-2}) rates in numerical experiments on several applications and provide posterior numerical certificates for the fast rates.
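
To make the mechanics behind the abstract concrete, here is a minimal Python sketch of an accelerated Bregman proximal gradient loop; it is an illustration, not the paper's reference implementation. It assumes the negative-entropy reference function h(x) = Σᵢ xᵢ log xᵢ on the probability simplex, so the Bregman distance D_h(u, z) = h(u) − h(z) − ⟨∇h(z), u − z⟩ is the KL divergence and the proximal step has a multiplicative closed form. It takes the simple term Ψ = 0, treats the triangle scaling exponent γ as an input, and uses the Nesterov-style recursion (1 − θ_{k+1}) / θ_{k+1}^γ = 1 / θ_k^γ with a θ^{γ−1} damping of the Bregman step, which is one plausible reading of the rate described above. All function names and the quadratic test objective are illustrative assumptions.

import numpy as np

def bregman_step(grad, z, coeff):
    """Closed-form Bregman proximal step for the entropy reference on the simplex:
    argmin_u <grad, u> + coeff * D_h(u, z) gives u_i proportional to z_i * exp(-grad_i / coeff)."""
    u = z * np.exp(-(grad - grad.min()) / coeff)  # shift by grad.min() for numerical stability
    return u / u.sum()

def next_theta(theta, gamma, iters=60):
    """Solve (1 - t) / t**gamma = 1 / theta**gamma for t in (0, theta) by bisection;
    the left side is decreasing in t, so bisection converges. For gamma = 2 this is
    the classical accelerated-gradient momentum recursion."""
    target = 1.0 / theta**gamma
    lo, hi = 1e-16, theta
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (1.0 - mid) / mid**gamma > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def abpg(grad_f, x0, L, gamma=2.0, iters=200):
    """Accelerated Bregman proximal gradient sketch with Psi = 0: y is the
    extrapolated point, z the momentum sequence, and the Bregman step is damped
    by theta**(gamma - 1), mirroring the O(k^{-gamma}) analysis (assumption)."""
    x = x0.copy()
    z = x0.copy()
    theta = 1.0
    for _ in range(iters):
        y = (1.0 - theta) * x + theta * z
        z = bregman_step(grad_f(y), z, theta**(gamma - 1.0) * L)
        x = (1.0 - theta) * x + theta * z
        theta = next_theta(theta, gamma)
    return x

# Toy usage: minimize f(x) = 0.5 * x^T A x over the simplex. f is L-smooth
# relative to the entropy reference with L = lambda_max(A), since the entropy
# Hessian diag(1/x_i) dominates the identity on the simplex interior.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T
L = np.linalg.eigvalsh(A).max()
x_star = abpg(lambda x: A @ x, np.full(5, 0.2), L)

With γ = 2 the recursion reproduces the classical accelerated momentum schedule, which is why the Euclidean case recovers Nesterov's rate; the adaptive variants mentioned in the abstract instead tune γ (and the step coefficient) on the fly.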
Publisher
Springer Nature B.V.
Subject
Convergence / Convex analysis / Convexity / Euclidean geometry / Optimization