A Stochastic Block-coordinate Proximal Newton Method for Nonconvex Composite Minimization
by Zhu, Hong; Qian, Xun
in Algorithms / Convergence / Mapping / Newton methods / Optimization / Sampling
2024
Overview
We propose a stochastic block-coordinate proximal Newton method for minimizing the sum of a blockwise Lipschitz-continuously differentiable function and a separable nonsmooth convex function. In each iteration, the method randomly selects a block and approximately solves a strongly convex regularized quadratic subproblem, utilizing second-order information from the smooth component of the objective function. A backtracking line search is employed to ensure monotone decrease of the objective value. We demonstrate that, under a certain sampling assumption, the fundamental convergence results of our stochastic method match the corresponding results for the inexact proximal Newton method. We study the convergence of the sequence of expected objective values and the convergence of the sequence of expected residual mapping norms under various sampling assumptions. Furthermore, we introduce a variant that employs the unit step size together with the Lipschitz constant of the gradient of the smooth component to formulate the strongly convex regularized quadratic subproblem. In addition to establishing the global convergence rate, we provide a local convergence analysis for this variant under a certain sampling assumption and higher-order metric subregularity of the residual mapping. To the best of the authors' knowledge, this is the first stochastic second-order algorithm with a superlinear local convergence rate for nonconvex composite optimization problems. Finally, we conduct numerical experiments to demonstrate the effectiveness and convergence of the proposed algorithm.
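To make the iteration described above concrete, here is a minimal NumPy sketch of this style of method, specialized to the separable nonsmooth term g = λ‖·‖₁ and a uniformly sampled block. The inner proximal-gradient solver for the quadratic subproblem, the regularization constant mu, and all callable names (f_val, f_grad_block, f_hess_block) are illustrative assumptions for the sketch, not the authors' implementation; the paper's inexactness tolerances and sampling assumptions are not modeled.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def stochastic_block_prox_newton(f_val, f_grad_block, f_hess_block, g_weight,
                                 x0, blocks, mu=1e-3, n_iters=200,
                                 inner_iters=20, beta=0.5, sigma=1e-4, seed=0):
    """Illustrative sketch: minimize f(x) + g_weight * ||x||_1.

    f_val(x)            -> float, value of the smooth part f
    f_grad_block(x, idx) -> block gradient of f (hypothetical callables)
    f_hess_block(x, idx) -> block Hessian of f
    blocks              -> list of index arrays partitioning the coordinates
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_iters):
        idx = blocks[rng.integers(len(blocks))]   # sample one block uniformly
        g_i = f_grad_block(x, idx)                # block gradient of smooth part
        H_i = f_hess_block(x, idx) + mu * np.eye(len(idx))  # strongly convex model
        # Approximately solve the regularized quadratic subproblem
        #   min_d  g_i.T d + 0.5 d.T H_i d + g_weight * ||x_i + d||_1
        # by a few proximal-gradient steps on the quadratic model.
        L = np.linalg.norm(H_i, 2)                # Lipschitz const of model gradient
        d = np.zeros(len(idx))
        for _ in range(inner_iters):
            grad_model = g_i + H_i @ d
            z = x[idx] + d - grad_model / L
            d = soft_threshold(z, g_weight / L) - x[idx]
        # Backtracking line search keeps F = f + g monotonically decreasing.
        F_old = f_val(x) + g_weight * np.abs(x).sum()
        decrease = g_i @ d + g_weight * (np.abs(x[idx] + d).sum()
                                         - np.abs(x[idx]).sum())
        alpha = 1.0
        while True:
            x_trial = x.copy()
            x_trial[idx] = x[idx] + alpha * d
            F_trial = f_val(x_trial) + g_weight * np.abs(x_trial).sum()
            if F_trial <= F_old + sigma * alpha * decrease or alpha < 1e-10:
                break
            alpha *= beta
        x = x_trial
    return x
```

Only the sampled block's gradient and Hessian are touched per iteration, which is the point of the block-coordinate design: the per-step cost scales with the block size rather than the full dimension.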
Publisher
Cornell University Library, arXiv.org