Asset Details
Dualize, Split, Randomize: Toward Fast Nonsmooth Optimization Algorithms
by Richtárik, Peter; Condat, Laurent; Mishchenko, Konstantin; Salim, Adil
in Algorithms / Convergence / Convexity / Image processing / Linear operators / Machine learning / Operators (mathematics) / Optimization
2022
Journal Article
Overview
We consider minimizing the sum of three convex functions, where the first one, F, is smooth, the second one is nonsmooth and proximable, and the third one is the composition of a nonsmooth proximable function with a linear operator L. This template problem has many applications, for instance in image processing and machine learning. First, we propose a new primal–dual algorithm, which we call PDDY, for this problem. It is constructed by applying Davis–Yin splitting to a monotone inclusion in a primal–dual product space, where the operators are monotone under a specific metric depending on L. We show that three existing algorithms (the two forms of the Condat–Vũ algorithm and the PD3O algorithm) have the same structure, so that PDDY is the fourth missing link in this self-consistent class of primal–dual algorithms. This representation eases the convergence analysis: it allows us to derive sublinear convergence rates in general, and linear convergence results in the presence of strong convexity. Moreover, within our broad and flexible analysis framework, we propose new stochastic generalizations of the algorithms, in which a variance-reduced random estimate of the gradient of F is used instead of the true gradient. Furthermore, we obtain, as a special case of PDDY, a linearly converging algorithm for the minimization of a strongly convex function F under a linear constraint; we discuss its important application to decentralized optimization.
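The abstract names Davis–Yin splitting as the building block behind PDDY. As a minimal sketch of that building block alone, the following illustrates the Davis–Yin iteration for min f + g + h with f smooth, i.e. the special case where the linear operator L is the identity; the toy objective (`a`, `lam`, the nonnegativity constraint) and all function names are hypothetical choices for illustration, not the paper's algorithm, which operates in a primal–dual product space.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def davis_yin(grad_f, prox_g, prox_h, z0, gamma, n_iters=500):
    """Davis-Yin splitting for min_x f(x) + g(x) + h(x), with f smooth.

    The step size gamma must satisfy gamma < 2 / L_f, where L_f is the
    Lipschitz constant of grad_f.
    """
    z = z0.copy()
    for _ in range(n_iters):
        x_g = prox_g(z, gamma)                                    # prox step on g
        x_h = prox_h(2.0 * x_g - z - gamma * grad_f(x_g), gamma)  # prox step on h
        z = z + x_h - x_g                                         # fixed-point update
    return x_g

# Toy instance (hypothetical, for illustration only):
#   f(x) = 0.5 * ||x - a||^2     (smooth; gradient is 1-Lipschitz)
#   g(x) = lam * ||x||_1         (nonsmooth, proximable)
#   h(x) = indicator of {x >= 0} (prox = projection onto the orthant)
a, lam = np.array([1.5, -2.0, 0.3]), 0.5
x = davis_yin(
    grad_f=lambda x: x - a,
    prox_g=lambda v, g: soft_threshold(v, g * lam),
    prox_h=lambda v, g: np.maximum(v, 0.0),
    z0=np.zeros(3),
    gamma=1.0,
)
# Coordinatewise closed form for this toy problem: max(a - lam, 0)
```

For this toy problem the iterates settle on max(a − lam, 0) coordinatewise, which can be checked against the closed-form solution of each one-dimensional subproblem.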
Publisher
Springer Nature B.V