Catalogue Search | MBRL
Explore the vast range of titles available.
230,600 result(s) for "dynamic model"
An Introductory Guide to Event Study Models
2023
The event study model is a powerful econometric tool used for the purpose of estimating dynamic treatment effects. One of its most appealing features is that it provides a built-in graphical summary of results, which can reveal rich patterns of behavior. Another value of the picture is the estimated pre-event pseudo-"effects", which provide a type of placebo test. In this essay I aim to provide a framework for a shared understanding of these models. There are several (sometimes subtle) decisions and choices faced by users of these models, and I offer guidance for these decisions.
Journal Article
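As a companion to the abstract above, here is a minimal event-study sketch, not drawn from the article itself: a simulated panel with a common event date, unit and time fixed effects, and event-time dummies with relative period -1 omitted as the reference, so the pre-event coefficients play the placebo role the abstract describes. All variable names, parameter values, and the statsmodels-based setup are illustrative assumptions.

```python
# Minimal event-study sketch (illustrative, not from the article): simulate a
# panel where treated units receive an effect from event time 0 onward, then
# estimate dynamic treatment effects with unit and time fixed effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
units, periods, event_period = 200, 10, 5
rows = []
for i in range(units):
    treated = i < units // 2
    for t in range(periods):
        rel = t - event_period if treated else np.nan       # event time
        effect = 1.0 if (treated and t >= event_period) else 0.0
        rows.append({"unit": i, "time": t, "rel": rel,
                     "y": 0.5 * t + effect + rng.normal()})
df = pd.DataFrame(rows)

# Event-time dummies for treated units (never-treated units get all zeros);
# relative period -1 is the omitted reference category.
for k in range(-event_period, periods - event_period):
    if k == -1:
        continue
    name = f"lead{-k}" if k < 0 else f"lag{k}"
    df[name] = (df["rel"] == k).astype(float)

dummies = " + ".join(c for c in df.columns if c.startswith(("lead", "lag")))
fit = smf.ols(f"y ~ {dummies} + C(unit) + C(time)", data=df).fit()
print(fit.params.filter(regex="lead|lag"))  # pre-event ("lead") terms near zero
```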
WHY YOU SHOULD NEVER USE THE HODRICK-PRESCOTT FILTER
2018
Here’s why. (a) The Hodrick-Prescott (HP) filter introduces spurious dynamic relations that have no basis in the underlying data-generating process. (b) Filtered values at the end of the sample are very different from those in the middle and are also characterized by spurious dynamics. (c) A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice. (d) There is a better alternative. A regression of the variable at date t on the four most recent values as of date t − h achieves all the objectives sought by users of the HP filter with none of its drawbacks.
Journal Article
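The regression-based alternative sketched in point (d) of the abstract can be written in a few lines: regress the series at date t on a constant and its four most recent values as of date t - h, and keep the residual as the cyclical component. The horizon h = 8 below (two years of quarterly data) and the toy series are illustrative assumptions, not values taken from the abstract.

```python
# Sketch of the regression-based alternative to the HP filter: regress y[t] on
# y[t-h], ..., y[t-h-3] plus a constant and keep the residual as the cycle.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def regression_cycle(y: pd.Series, h: int = 8, p: int = 4) -> pd.Series:
    """Residual from regressing y[t] on y[t-h], ..., y[t-h-p+1] and a constant."""
    X = pd.concat({f"lag{h + j}": y.shift(h + j) for j in range(p)}, axis=1)
    X = sm.add_constant(X)
    fit = sm.OLS(y, X, missing="drop").fit()
    return y - X @ fit.params          # cyclical component (NaN where lags missing)

# Toy quarterly-style series: trend plus cycle plus noise (illustrative)
t = np.arange(200)
y = pd.Series(0.02 * t + np.sin(2 * np.pi * t / 32)
              + np.random.default_rng(1).normal(0, 0.1, 200))
cycle = regression_cycle(y)
print(round(cycle.dropna().std(), 3))
```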
HAR Inference: Recommendations for Practice
by Lewis, Daniel J.; Lazarus, Eben; Watson, Mark W.
in Heteroscedasticity- and autocorrelation-robust estimation; Long-run variance; Time series
2018
The classic papers by Newey and West (1987) and Andrews (1991) spurred a large body of work on how to improve heteroscedasticity- and autocorrelation-robust (HAR) inference in time series regression. This literature finds that using a larger-than-usual truncation parameter to estimate the long-run variance, combined with Kiefer-Vogelsang (2002, 2005) fixed-b critical values, can substantially reduce size distortions, at only a modest cost in (size-adjusted) power. Empirical practice, however, has not kept up. This article therefore draws on the post-Newey-West/Andrews literature to make concrete recommendations for HAR inference. We derive truncation parameter rules that choose a point on the size-power tradeoff to minimize a loss function. If Newey-West tests are used, we recommend the truncation parameter rule S = 1.3T^(1/2) and (nonstandard) fixed-b critical values. For tests of a single restriction, we find advantages to using the equal-weighted cosine (EWC) test, where the long-run variance is estimated by projections onto Type II cosines, using ν = 0.4T^(2/3) cosine terms; for this test, fixed-b critical values are, conveniently, t_ν or F. We assess these rules using first an ARMA/GARCH Monte Carlo design, then a dynamic factor model design estimated using 207 quarterly U.S. macroeconomic time series.
Journal Article
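To make the quoted rules concrete, here is a sketch of the equal-weighted cosine (EWC) approach for the simplest case of testing a population mean: project the demeaned series onto the first ν Type II cosines, average the squared coefficients as the long-run variance estimate, and compare the t-statistic with t_ν critical values. The rule ν = 0.4T^(2/3) is quoted from the abstract; the rest of the construction is a standard EWC implementation written here for illustration, not code from the article.

```python
# EWC long-run variance sketch for testing a mean (illustrative implementation).
import numpy as np
from scipy import stats

def ewc_mean_test(y: np.ndarray, mu0: float = 0.0):
    T = len(y)
    nu = max(1, int(np.floor(0.4 * T ** (2 / 3))))     # number of cosine terms
    u = y - y.mean()
    t_idx = np.arange(1, T + 1)
    lam = np.array([
        np.sqrt(2 / T) * np.sum(np.cos(np.pi * j * (t_idx - 0.5) / T) * u)
        for j in range(1, nu + 1)
    ])
    omega = np.mean(lam ** 2)                           # long-run variance estimate
    tstat = np.sqrt(T) * (y.mean() - mu0) / np.sqrt(omega)
    pval = 2 * stats.t.sf(abs(tstat), df=nu)            # fixed-b reference: t_nu
    return tstat, pval, nu

# Toy AR(1) noise with zero mean
rng = np.random.default_rng(2)
e = np.zeros(500)
for t in range(1, 500):
    e[t] = 0.7 * e[t - 1] + rng.normal()
print(ewc_mean_test(e))
```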
Quantile Regression: 40 Years On
2017
Since Quetelet's work in the nineteenth century, social science has iconified the average man, that hypothetical man without qualities who is comfortable with his head in the oven and his feet in a bucket of ice. Conventional statistical methods since Quetelet have sought to estimate the effects of policy treatments for this average man. However, such effects are often quite heterogeneous: Medical treatments may improve life expectancy but also impose serious short-term risks; reducing class sizes may improve the performance of good students but not help weaker ones, or vice versa. Quantile regression methods can help to explore these heterogeneous effects. Some recent developments in quantile regression methods are surveyed in this review.
Journal Article
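A small simulated illustration of the heterogeneity point made in the abstract, using statsmodels' quantile regression (the data-generating process and quantile levels below are illustrative assumptions):

```python
# Quantile regression sketch: the slope of x differs across the conditional
# distribution of y, which a mean regression would average away.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x, size=n)  # heteroskedastic noise
df = pd.DataFrame({"x": x, "y": y})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("y ~ x", df).fit(q=q)
    print(f"q={q}: slope = {fit.params['x']:.3f}")  # slopes fan out across quantiles
```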
GENERALIZED AUTOREGRESSIVE SCORE MODELS WITH APPLICATIONS
by Koopman, Siem Jan; Lucas, André; Creal, Drew
in Classroom observation; Copulas; Econometric models
2013
We propose a class of observation-driven time series models referred to as generalized autoregressive score (GAS) models. The mechanism to update the parameters over time is the scaled score of the likelihood function. This new approach provides a unified and consistent framework for introducing time-varying parameters in a wide class of nonlinear models. The GAS model encompasses other well-known models such as the generalized autoregressive conditional heteroskedasticity, autoregressive conditional duration, autoregressive conditional intensity, and Poisson count models with time-varying mean. In addition, our approach can lead to new formulations of observation-driven models. We illustrate our framework by introducing new model specifications for time-varying copula functions and for multivariate point processes with time-varying parameters. We study the models in detail and provide simulation and empirical evidence.
Journal Article
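A minimal illustration of the scaled-score updating mechanism described above, for the Gaussian volatility case: with f_t = log(sigma_t^2) and y_t ~ N(0, sigma_t^2), the score of the log-likelihood with respect to f_t, scaled by the inverse Fisher information, is y_t^2*exp(-f_t) - 1, and f_t evolves as an autoregression driven by that scaled score. The parameter values and toy data below are illustrative, not taken from the article.

```python
# GAS(1,1) sketch for Gaussian volatility (illustrative, not from the article).
import numpy as np

def gas_gaussian_volatility(y, omega=0.0, alpha=0.1, beta=0.95):
    f = np.empty(len(y) + 1)
    f[0] = np.log(np.var(y))                      # initialize at the sample variance
    for t in range(len(y)):
        score = y[t] ** 2 * np.exp(-f[t]) - 1.0   # scaled score of N(0, exp(f_t))
        f[t + 1] = omega + beta * f[t] + alpha * score
    return np.exp(f)                              # filtered conditional variances

rng = np.random.default_rng(4)
y = rng.standard_t(df=5, size=1000)               # toy heavy-tailed series
sigma2 = gas_gaussian_volatility(y)
print(sigma2[:5])
```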
Robust Bayesian Inference via Coarsening
2019
The standard approach to Bayesian inference is based on the assumption that the distribution of the data belongs to the chosen model class. However, even a small violation of this assumption can have a large impact on the outcome of a Bayesian procedure. We introduce a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense. When closeness is defined in terms of relative entropy, the resulting "coarsened" posterior can be approximated by simply tempering the likelihood, that is, by raising the likelihood to a fractional power; thus, inference can usually be implemented via standard algorithms, and one can even obtain analytical solutions when using conjugate priors. Some theoretical properties are derived, and we illustrate the approach with real and simulated data using mixture models and autoregressive models of unknown order. Supplementary materials for this article are available online.
Journal Article
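A conjugate sketch of the tempering idea described in the abstract, for a Beta-Bernoulli model: raising the likelihood to a fractional power keeps the update in closed form and widens the posterior relative to the standard Bayes update. The particular tempering fraction below is an illustrative assumption, not the article's recommended choice.

```python
# Tempered ("coarsened") posterior sketch with a conjugate Beta-Bernoulli model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.binomial(1, 0.3, size=500)          # Bernoulli data
s, n = x.sum(), len(x)
a, b = 1.0, 1.0                             # Beta(1, 1) prior
zeta = 0.2                                  # illustrative tempering fraction

standard = stats.beta(a + s, b + n - s)                 # ordinary posterior
coarsened = stats.beta(a + zeta * s, b + zeta * (n - s))  # power-likelihood posterior
print("standard 95% CI :", standard.interval(0.95))
print("coarsened 95% CI:", coarsened.interval(0.95))    # wider, more robust
```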
Realized GARCH: a joint model for returns and realized measures of volatility
by Hansen, Peter Reinhard; Shek, Howard Howan; Huang, Zhuo
in Economic models; Forecasting models; GARCH models
2012
We introduce a new framework, Realized GARCH, for the joint modeling of returns and realized measures of volatility. A key feature is a measurement equation that relates the realized measure to the conditional variance of returns. The measurement equation facilitates a simple modeling of the dependence between returns and future volatility. Realized GARCH models with a linear or log-linear specification have many attractive features. They are parsimonious, simple to estimate, and imply an ARMA structure for the conditional variance and the realized measure. An empirical application with Dow Jones Industrial Average stocks and an exchange traded index fund shows that a simple Realized GARCH structure leads to substantial improvements in the empirical fit over standard GARCH models that only use daily returns.
Journal Article
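A sketch of the log-linear Realized GARCH recursions the abstract describes: the conditional variance is driven by the lagged realized measure through a GARCH equation, and a measurement equation ties the log realized measure to the log conditional variance plus a leverage term in the standardized return. The parameter values below are illustrative, and the code only filters the variance path rather than estimating the model.

```python
# Log-linear Realized GARCH filtering sketch (illustrative parameter values).
import numpy as np

def realized_garch_filter(r, x, omega=0.06, beta=0.55, gamma=0.40,
                          xi=-0.18, phi=1.0, tau1=-0.07, tau2=0.07):
    T = len(r)
    logh = np.empty(T)
    u = np.empty(T)                          # measurement-equation residuals
    logh[0] = np.log(np.var(r))
    for t in range(T):
        if t > 0:
            # GARCH equation: log h_t = omega + beta*log h_{t-1} + gamma*log x_{t-1}
            logh[t] = omega + beta * logh[t - 1] + gamma * np.log(x[t - 1])
        z = r[t] / np.exp(0.5 * logh[t])     # standardized return
        # Measurement equation: log x_t = xi + phi*log h_t + tau(z_t) + u_t
        u[t] = np.log(x[t]) - (xi + phi * logh[t] + tau1 * z + tau2 * (z**2 - 1))
    return np.exp(logh), u

rng = np.random.default_rng(6)
r = rng.normal(0, 1, 500)                    # toy daily returns
x = r**2 + 0.1 * rng.chisquare(1, 500)       # toy realized measure
h, u = realized_garch_filter(r, x)
print(h[:5])
```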
Using Synthetic Controls
2021
Probably because of their interpretability and transparent nature, synthetic controls have become widely applied in empirical research in economics and the social sciences. This article aims to provide practical guidance to researchers employing synthetic control methods. The article starts with an overview and an introduction to synthetic control estimation. The main sections discuss the advantages of the synthetic control framework as a research design, and describe the settings where synthetic controls provide reliable estimates and those where they may fail. The article closes with a discussion of recent extensions, related methods, and avenues for future research.
Journal Article
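A minimal sketch of the synthetic control construction the article discusses: choose nonnegative donor weights that sum to one and best reproduce the treated unit's pre-treatment outcomes, then read the treatment effect off the post-treatment gap. The simulated data and optimizer choice below are illustrative assumptions.

```python
# Synthetic control sketch: constrained least squares for donor weights.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
T0, T1, J = 20, 10, 15                       # pre periods, post periods, donors
donors = rng.normal(0, 1, (T0 + T1, J)).cumsum(axis=0)
treated = donors[:, :3].mean(axis=1) + rng.normal(0, 0.1, T0 + T1)
treated[T0:] += 2.0                          # treatment effect after period T0

def loss(w):
    return np.sum((treated[:T0] - donors[:T0] @ w) ** 2)

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * J
w = minimize(loss, np.full(J, 1.0 / J), bounds=bounds,
             constraints=cons, method="SLSQP").x

gap = treated - donors @ w                   # pre-period gap ~ 0; post-period gap ~ effect
print(gap[T0:].mean())
```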
MAXIMUM LIKELIHOOD ESTIMATION AND INFERENCE FOR APPROXIMATE FACTOR MODELS OF HIGH DIMENSION
2016
An approximate factor model of high dimension has two key features. First, the idiosyncratic errors are correlated and heteroskedastic over both the cross-section and time dimensions; the correlations and heteroskedasticities are of unknown forms. Second, the number of variables is comparable to, or even greater than, the sample size. Thus, a large number of parameters exist under a high-dimensional approximate factor model. Most widely used approaches to estimation are principal component based. This paper considers the maximum likelihood-based estimation of the model. Consistency, rate of convergence, and limiting distributions are obtained under various identification restrictions. Monte Carlo simulations show that the likelihood method is easy to implement and has good finite sample properties.
Journal Article
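A simplified contrast of the two estimation approaches the abstract mentions, principal components versus likelihood-based estimation. The scikit-learn FactorAnalysis call below fits classical maximum likelihood factor analysis with a diagonal idiosyncratic covariance, so it is a stand-in for, not an implementation of, the high-dimensional estimator studied in the paper.

```python
# Principal components versus ML factor analysis on a simulated factor model.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(8)
T, N, k = 100, 120, 3                        # sample size, variables, factors
F = rng.normal(size=(T, k))                  # latent factors
L = rng.normal(size=(N, k))                  # loadings
X = F @ L.T + rng.normal(scale=rng.uniform(0.5, 2.0, N), size=(T, N))

# Principal-components factor estimate: first k left singular vectors of X
U, s, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
F_pc = U[:, :k] * np.sqrt(T)

# Likelihood-based estimate (classical ML factor analysis, diagonal noise)
F_ml = FactorAnalysis(n_components=k).fit_transform(X)

def span_r2(F_hat):
    # R^2 of regressing the true factors on the estimated factor space
    beta, *_ = np.linalg.lstsq(F_hat, F, rcond=None)
    resid = F - F_hat @ beta
    return 1 - resid.var() / F.var()

print("PC R^2:", round(span_r2(F_pc), 3))
print("ML R^2:", round(span_r2(F_ml), 3))
```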