Asset Details
Structured Prediction: Statistical and Computational Guarantees in Learning and Inference
by Bello, Kevin
2021
Dissertation
Overview
Structured prediction consists of receiving a structured input and producing a combinatorial structure such as a tree, clustering, network, sequence, or permutation. From the computational viewpoint, structured prediction is in general considered intractable because the size of the output space is exponential in the input size; in image segmentation tasks, for instance, the number of admissible segmentations is exponential in the number of pixels. A second factor is the combination of high input dimensionality and the amount of available data: in structured prediction the input commonly lives in a high-dimensional space, which requires jointly reasoning about thousands or millions of variables while contending with a limited amount of data. Learning and inference methods with strong computational and statistical guarantees are therefore desired. The focus of our research is to propose principled methods for structured prediction that both run in polynomial time, i.e., are computationally efficient, and require only a polynomial number of data samples, i.e., are statistically efficient.

The main contributions of this thesis are as follows:

i. We develop an efficient and principled learning method for latent-variable models in structured prediction under Gaussian perturbations. We derive a Rademacher-based generalization bound and argue that the use of non-convex formulations in learning latent-variable models leads to tighter bounds on the Gibbs decoder distortion.

ii. We study the fundamental limits of structured prediction, i.e., we characterize the sample complexity necessary for learning factor graph models in the context of structured prediction. In particular, we show that finiteness of our novel pair-dimension is necessary for learning. Lastly, we show a connection between the pair-dimension and the VC-dimension, which allows existing results on the VC-dimension to be used to calculate the pair-dimension.

iii. We analyze a generative model based on connected graphs and find the structural conditions on the graph that allow exact recovery of the node labels. In particular, we show that exact recovery is achievable in polynomial time for a large class of graphs. Our analysis is based on convex relaxations; we thoroughly analyze a semidefinite program and a degree-4 sum-of-squares program. Finally, we extend this model to consider linear constraints (e.g., fairness) and formally explain the effect of the added constraints on the probability of exact recovery.
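The abstract's opening point, that the output space of structured prediction grows exponentially with the input size, can be made concrete with a small sketch. The Python snippet below is a hypothetical illustration and is not taken from the thesis: it enumerates every binary labeling of an n-pixel input and scores it with an arbitrary linear model, so brute-force argmax inference visits 2^n candidate structures.

```python
import itertools
import random

def brute_force_argmax(scores, n):
    """Enumerate all 2**n binary labelings and return the highest-scoring one.

    scores[i] is an arbitrary, illustrative gain for labeling pixel i as 1.
    Real structured models couple the labels (e.g., pairwise smoothness
    terms); the point here is only that the number of admissible structures
    is exponential in n, so exhaustive inference quickly becomes infeasible.
    """
    best_labeling, best_score = None, float("-inf")
    for labeling in itertools.product((0, 1), repeat=n):
        score = sum(s for s, y in zip(scores, labeling) if y == 1)
        if score > best_score:
            best_labeling, best_score = labeling, score
    return best_labeling, best_score

if __name__ == "__main__":
    n = 20  # already 2**20 (about one million) candidate structures
    scores = [random.uniform(-1.0, 1.0) for _ in range(n)]
    _, best = brute_force_argmax(scores, n)
    print(f"visited {2 ** n} structures, best score = {best:.3f}")
```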
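Contribution iii relies on convex relaxations, in particular a semidefinite program, for exact recovery of node labels. As a rough generic sketch rather than the thesis's actual formulation, the snippet below relaxes the combinatorial problem max over x in {-1,+1}^n of x^T W x to an SDP over a unit-diagonal PSD matrix and rounds its top eigenvector back to labels. It assumes the third-party packages numpy and cvxpy, and the toy similarity matrix W is invented for illustration.

```python
import numpy as np
import cvxpy as cp

def sdp_label_recovery(W):
    """Generic SDP relaxation of max_{x in {-1,+1}^n} x^T W x.

    Relax the rank-one matrix x x^T to any PSD matrix Y with unit diagonal,
    solve the convex program, then round the leading eigenvector of Y to
    +/-1 labels. This is a textbook relaxation, shown only to illustrate
    recovering node labels through convex optimization.
    """
    n = W.shape[0]
    Y = cp.Variable((n, n), PSD=True)
    problem = cp.Problem(cp.Maximize(cp.trace(W @ Y)), [cp.diag(Y) == 1])
    problem.solve()
    # Rounding step: threshold the top eigenvector of the relaxed solution.
    _, eigvecs = np.linalg.eigh(Y.value)
    return np.sign(eigvecs[:, -1])

if __name__ == "__main__":
    # Toy similarity matrix: two noisy clusters of five nodes each (made-up data).
    rng = np.random.default_rng(0)
    truth = np.concatenate([np.ones(5), -np.ones(5)])
    W = np.outer(truth, truth) + 0.3 * rng.standard_normal((10, 10))
    W = (W + W.T) / 2
    labels = sdp_label_recovery(W)
    # 1.0 means exact recovery of the labels up to a global sign flip.
    print("correlation with ground truth:", abs(labels @ truth) / 10)
```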
Publisher
ProQuest Dissertations & Theses
Subject
Algebra / Artificial intelligence / Connectivity / Graphs / Machine learning / Mathematics / Product reviews
ISBN
9798379836092