Asset Details
MODEL SELECTION FOR HIGH-DIMENSIONAL LINEAR REGRESSION WITH DEPENDENT OBSERVATIONS
by
Ing, Ching-Kang
in Convergence / Greedy algorithms / Knowledge / Linear equations / Regression analysis / Regression models / Sparsity / Studies
2020
Journal Article
Overview
We investigate the prediction capability of the orthogonal greedy algorithm (OGA) in high-dimensional regression models with dependent observations. The rates of convergence of the prediction error of OGA are obtained under a variety of sparsity conditions. To prevent OGA from overfitting, we introduce a high-dimensional Akaike’s information criterion (HDAIC) to determine the number of OGA iterations. A key contribution of this work is to show that OGA, used in conjunction with HDAIC, can achieve the optimal convergence rate without knowledge of how sparse the underlying high-dimensional model is.
Publisher
Institute of Mathematical Statistics
Subject
Convergence / Greedy algorithms / Knowledge / Linear equations / Regression analysis / Regression models / Sparsity / Studies