Asset Details
Practical Algorithms for Learning Near-Isometric Linear Embeddings
by Hao-Jun Michael Shi, Jerry Luo, Qi Yang, Kan Zhu, Kayla Shapiro
in Algorithms / Convergence / Data points / Machine learning / Mapping / Mathematical analysis / Matrix methods / Norms / Signal processing
Paper, 2016
Overview
We propose two practical non-convex approaches for learning near-isometric, linear embeddings of finite sets of data points. Given a set of training points \(\mathcal{X}\), we consider the secant set \(S(\mathcal{X})\) that consists of all pairwise difference vectors of \(\mathcal{X}\), normalized to lie on the unit sphere. The problem can be formulated as finding a symmetric and positive semi-definite matrix \(\boldsymbol{\Psi}\) that preserves the norms of all the vectors in \(S(\mathcal{X})\) up to a distortion parameter \(\delta\). Motivated by non-negative matrix factorization, we reformulate our problem as a Frobenius norm minimization problem, which we solve using the Alternating Direction Method of Multipliers (ADMM), yielding an algorithm, FroMax. Another method solves for a projection matrix \(\boldsymbol{\Psi}\) by minimizing the restricted isometry property (RIP) directly over the set of symmetric, positive semi-definite matrices; applying ADMM and a Moreau decomposition on a proximal mapping, we develop a second algorithm, NILE-Pro, for dimensionality reduction. FroMax is shown to converge faster for smaller \(\delta\) while NILE-Pro converges faster for larger \(\delta\). Both non-convex approaches are empirically demonstrated to be more computationally efficient than prior convex approaches for a number of applications in machine learning and signal processing.
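To make the setup concrete, here is a minimal NumPy sketch (an illustration only, not the paper's FroMax or NILE-Pro solvers) of the two objects the overview defines: the secant set \(S(\mathcal{X})\) of normalized pairwise differences, and the distortion parameter \(\delta\) measuring how far a candidate matrix \(\boldsymbol{\Psi}\) is from preserving the norms of those secants. The function names are hypothetical.

```python
import numpy as np

def secant_set(X):
    """Secant set S(X): all pairwise difference vectors of the rows
    of X, normalized to lie on the unit sphere."""
    n = X.shape[0]
    secants = []
    for i in range(n):
        for j in range(i + 1, n):
            d = X[i] - X[j]
            norm = np.linalg.norm(d)
            if norm > 0:  # skip coincident points
                secants.append(d / norm)
    return np.array(secants)

def max_distortion(Psi, S):
    """Smallest delta with |v^T Psi v - 1| <= delta for every unit
    secant v in S, i.e. how far Psi is from norm preservation."""
    quad = np.einsum('ij,jk,ik->i', S, Psi, S)  # v^T Psi v per secant
    return np.max(np.abs(quad - 1.0))
```

For example, taking \(\boldsymbol{\Psi}\) to be the identity gives zero distortion, since every secant is already unit-norm; the algorithms in the paper instead seek a low-rank \(\boldsymbol{\Psi}\) (whose factorization gives the embedding) with distortion at most \(\delta\).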
Publisher
Cornell University Library, arXiv.org