Asset Details
Kernel Regularized Least Squares: Reducing Misspecification Bias with a Flexible and Interpretable Machine Learning Approach
by Hainmueller, Jens; Hazlett, Chad
in Artificial intelligence / Classification / Estimating techniques / Estimators / Generalized linear models / Genocide / Inference / Linear analysis / Linear regression / Machine learning / Modeling / Normality / Partial derivatives / Regression analysis / Simulation / Social sciences / Standard error / Statistical discrepancies / Unbiased estimators
2014
Journal Article
Overview
We propose the use of Kernel Regularized Least Squares (KRLS) for social science modeling and inference problems. KRLS borrows from machine learning methods designed to solve regression and classification problems without relying on linearity or additivity assumptions. The method constructs a flexible hypothesis space that uses kernels as radial basis functions and finds the best-fitting surface in this space by minimizing a complexity-penalized least squares problem. We argue that the method is well-suited for social science inquiry because it avoids strong parametric assumptions, yet allows interpretation in ways analogous to generalized linear models while also permitting more complex interpretation to examine nonlinearities, interactions, and heterogeneous effects. We also extend the method in several directions to make it more effective for social inquiry, by (1) deriving estimators for the pointwise marginal effects and their variances, (2) establishing unbiasedness, consistency, and asymptotic normality of the KRLS estimator under fairly general conditions, (3) proposing a simple automated rule for choosing the kernel bandwidth, and (4) providing companion software. We illustrate the use of the method through simulations and empirical examples.
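The estimator the abstract describes has a simple closed form: with a Gaussian (radial basis) kernel, the complexity-penalized least squares problem is solved by a kernel ridge system, and the pointwise marginal effects are the analytic derivatives of the fitted surface. The sketch below is a minimal illustration of that idea, not the authors' companion software; the function names (`krls_fit`, `krls_pwmfx`) are hypothetical, and the default bandwidth rule (setting the kernel bandwidth to the number of covariates) is one simple automated choice in the spirit the abstract mentions.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma2):
    # Gaussian (RBF) kernel: K[i, j] = exp(-||X_i - Z_j||^2 / sigma2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma2)

def krls_fit(X, y, lam=0.5, sigma2=None):
    # Minimize sum_i (y_i - f(x_i))^2 + lam * ||f||^2 over the
    # kernel-induced hypothesis space; the solution is
    #   c = (K + lam * I)^{-1} y,  f(x) = sum_i c_i k(x, x_i).
    n, d = X.shape
    if sigma2 is None:
        sigma2 = d  # simple automated bandwidth rule (assumption)
    K = gaussian_kernel(X, X, sigma2)
    c = np.linalg.solve(K + lam * np.eye(n), y)
    return c, sigma2

def krls_predict(Xnew, X, c, sigma2):
    # Fitted surface evaluated at new points
    return gaussian_kernel(Xnew, X, sigma2) @ c

def krls_pwmfx(Xnew, X, c, sigma2):
    # Pointwise marginal effects: for the Gaussian kernel,
    #   d f(x) / d x_j = sum_i c_i k(x, x_i) * (-2 / sigma2) * (x_j - x_ij)
    K = gaussian_kernel(Xnew, X, sigma2)          # (m, n)
    diff = Xnew[:, None, :] - X[None, :, :]       # (m, n, d)
    return (-2.0 / sigma2) * np.einsum('mn,mnd->md', K * c[None, :], diff)
```

Because the marginal effects vary over the covariate space, averaging `krls_pwmfx` over the sample gives a quantity interpretable like a generalized linear model coefficient, while the pointwise values expose nonlinearities and heterogeneous effects.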