Asset Details
Do you wish to reserve the book?
Optimization Methods for Large-Scale Machine Learning
by Nocedal, Jorge; Curtis, Frank E.; Bottou, Léon
in algorithm complexity analysis / machine learning / MATHEMATICS AND COMPUTING / noise reduction methods / numerical optimization / second-order methods / stochastic gradient methods / SURVEY and REVIEW
2018
We have placed the reservation for you!
By the way, why not check out events you can attend while you pick up your title?
You are currently in the queue to collect this book. We will notify you when it is your turn.
Oops! Something went wrong.
It looks like we were unable to place the reservation. Please try again later.
Are you sure you want to remove the book from the shelf?
Oops! Something went wrong.
Something went wrong while trying to remove the title from your shelf. Please try again later.
Do you wish to request the book?
Please note that the copy you have requested cannot be checked out. If you would like to check out this book, you can reserve another copy.
We have requested the book for you!
Your request was successful and will be processed during library working hours. You can check its status in My Requests.
Oops! Something went wrong.
It looks like we were unable to place your request. Please try again later.
Journal Article
Optimization Methods for Large-Scale Machine Learning
2018
Request Book From Autostore and Choose the Collection Method
Overview
This paper provides a review and commentary on the past, present, and future of numerical optimization algorithms in the context of machine learning applications. Through case studies on text classification and the training of deep neural networks, we discuss how optimization problems arise in machine learning and what makes them challenging. A major theme of our study is that large-scale machine learning represents a distinctive setting in which the stochastic gradient (SG) method has traditionally played a central role while conventional gradient-based nonlinear optimization techniques typically falter. Based on this viewpoint, we present a comprehensive theory of a straightforward, yet versatile SG algorithm, discuss its practical behavior, and highlight opportunities for designing algorithms with improved performance. This leads to a discussion about the next generation of optimization methods for large-scale machine learning, including an investigation of two main streams of research on techniques that diminish noise in the stochastic directions and methods that make use of second-order derivative approximations.
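The basic stochastic gradient (SG) method that the overview describes can be sketched in a few lines: at each step, the iterate is updated using the gradient of the loss at a single randomly chosen sample rather than the full dataset. The code below is a minimal illustrative sketch, not the paper's implementation; the least-squares example problem and all names in it are assumptions chosen for demonstration.

```python
import numpy as np

def sgd(grad, w0, data, lr=0.01, epochs=5, seed=0):
    """Minimal stochastic gradient descent: each update uses the
    gradient of the loss on a single randomly selected sample."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    for _ in range(epochs):
        # Visit samples in a fresh random order each epoch.
        for i in rng.permutation(len(data)):
            w -= lr * grad(w, data[i])
    return w

# Illustrative problem (an assumption, not from the paper):
# fit y = w*x by least squares on synthetic noisy data.
rng = np.random.default_rng(1)
xs = rng.uniform(-1.0, 1.0, 200)
ys = 3.0 * xs + 0.1 * rng.standard_normal(200)
samples = list(zip(xs, ys))

def grad(w, sample):
    # Gradient of the single-sample squared error (w*x - y)^2.
    x, y = sample
    return np.array([2.0 * (w[0] * x - y) * x])

w = sgd(grad, [0.0], samples, lr=0.1, epochs=10)
```

Because each step sees only one sample, the per-iteration cost is independent of the dataset size, which is precisely what makes SG attractive in the large-scale setting the paper surveys.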
Publisher
Society for Industrial and Applied Mathematics