Catalogue Search | MBRL
Explore the vast range of titles available.
20,413 result(s) for "Diffusion process"
The Regularity of the Linear Drift in Negatively Curved Spaces
by Shu, Lin; Ledrappier, François
in Brownian motion processes; Curves, Algebraic; Geodesic flows
2023
We show that the linear drift of the Brownian motion on the universal cover of a closed connected smooth Riemannian manifold is …
An Introductory Guide to Event Study Models
2023
The event study model is a powerful econometric tool used to estimate dynamic treatment effects. One of its most appealing features is that it provides a built-in graphical summary of results, which can reveal rich patterns of behavior. Another value of the picture is the estimated pre-event pseudo-"effects", which provide a type of placebo test. In this essay I aim to provide a framework for a shared understanding of these models. There are several (sometimes subtle) decisions and choices faced by users of these models, and I offer guidance for these decisions.
Journal Article
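The lead/lag regression behind these pictures is easy to sketch. Below is a minimal, hypothetical illustration (simulated panel; the names and numbers are mine, not the essay's): the period just before the event, k = -1, is omitted as the reference category, so the pre-event coefficients play the role of the pseudo-"effects" placebo test mentioned above.

```python
# A minimal event-study sketch on simulated data (not from the essay).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_units = 200
df = pd.DataFrame({
    "unit": np.repeat(np.arange(n_units), 11),
    "k": np.tile(np.arange(-5, 6), n_units),   # event time, from -5 to +5
})
# True effect: zero before the event, then growing by 0.5 per period.
df["y"] = 0.5 * np.maximum(df["k"], 0) + rng.normal(size=len(df))

# One dummy per event time, with k = -1 omitted as the reference period.
df["ktime"] = df["k"].astype(str)
fit = smf.ols("y ~ C(ktime, Treatment(reference='-1'))", data=df).fit()
print(fit.params.filter(like="ktime"))  # pre-event coefficients should be near 0
```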
WHY YOU SHOULD NEVER USE THE HODRICK-PRESCOTT FILTER
2018
Here’s why. (a) The Hodrick-Prescott (HP) filter introduces spurious dynamic relations that have no basis in the underlying data-generating process. (b) Filtered values at the end of the sample are very different from those in the middle and are also characterized by spurious dynamics. (c) A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice. (d) There is a better alternative. A regression of the variable at date t on the four most recent values as of date t − h achieves all the objectives sought by users of the HP filter with none of its drawbacks.
Journal Article
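Point (d) is concrete enough to code directly. The helper below is my own minimal implementation of the regression the abstract describes (the name hamilton_filter and the default h = 8, a common choice for quarterly data, are my additions):

```python
# Regress y_t on a constant and the four most recent values as of date t - h;
# the fitted values act as the trend and the residuals as the cycle.
import numpy as np
import statsmodels.api as sm

def hamilton_filter(y, h=8):
    y = np.asarray(y, dtype=float)
    n = len(y)
    X = sm.add_constant(
        np.column_stack([y[3 - j : n - h - j] for j in range(4)])
    )
    fit = sm.OLS(y[h + 3 :], X).fit()
    return fit.fittedvalues, fit.resid  # trend, cycle (first h+3 obs dropped)

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=200))  # a random-walk stand-in for log GDP
trend, cycle = hamilton_filter(series)
```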
GLOBAL SOLVABILITY OF A NETWORKED INTEGRATE-AND-FIRE MODEL OF MCKEAN–VLASOV TYPE
2015
We here investigate the well-posedness of a networked integrate-and-fire model describing an infinite population of neurons which interact with one another through their common statistical distribution. The interaction is of the self-excitatory type as, at any time, the potential of a neuron increases when some of the others fire: precisely, the kick it receives is proportional to the instantaneous proportion of firing neurons at the same time. From a mathematical point of view, the coefficient of proportionality, denoted by α, is of great importance as the resulting system is known to blow up for large values of α. In the current paper, we focus on the complementary regime and prove that existence and uniqueness hold for all time when α is small enough.
Journal Article
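To make the self-excitation mechanism concrete, here is a crude N-particle simulation in the spirit of the description above. The specifics (an OU-type diffusion, a unit threshold, reset to zero, a kick of α/N per firing neuron) are my simplification, not the paper's exact model; a small α keeps the firing cascade from blowing up.

```python
import numpy as np

rng = np.random.default_rng(2)
N, dt, steps = 1000, 1e-3, 5000
alpha, threshold = 0.2, 1.0
v = np.zeros(N)                    # membrane potentials

for _ in range(steps):
    v += -v * dt + np.sqrt(dt) * rng.normal(size=N)  # mean-reverting diffusion
    fired = v >= threshold
    while fired.any():             # resolve the cascade within this time step
        v[fired] = 0.0             # firing neurons reset to zero
        v[~fired] += alpha * fired.sum() / N  # others get the mean-field kick
        fired = v >= threshold
print("mean potential after simulation:", v.mean())
```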
GENERALIZED AUTOREGRESSIVE SCORE MODELS WITH APPLICATIONS
by Koopman, Siem Jan; Lucas, André; Creal, Drew
in Classroom observation; Copulas; Econometric models
2013
We propose a class of observation-driven time series models referred to as generalized autoregressive score (GAS) models. The mechanism to update the parameters over time is the scaled score of the likelihood function. This new approach provides a unified and consistent framework for introducing time-varying parameters in a wide class of nonlinear models. The GAS model encompasses other well-known models such as the generalized autoregressive conditional heteroskedasticity, autoregressive conditional duration, autoregressive conditional intensity, and Poisson count models with time-varying mean. In addition, our approach can lead to new formulations of observation-driven models. We illustrate our framework by introducing new model specifications for time-varying copula functions and for multivariate point processes with time-varying parameters. We study the models in detail and provide simulation and empirical evidence.
Journal Article
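The score-driven update is easy to write down in a toy case. Below is a minimal GAS(1,1) volatility filter under a Gaussian density with f_t = log σ_t², where the inverse-information-scaled score reduces to y_t² e^{-f_t} - 1; the parameter values are illustrative, not estimates.

```python
import numpy as np

def gas_volatility(y, omega=0.0, alpha=0.1, beta=0.95):
    """GAS(1,1) filter for f_t = log(sigma_t^2) under Gaussian observations."""
    f = np.empty(len(y))
    f[0] = np.log(np.var(y))
    for t in range(len(y) - 1):
        s = y[t] ** 2 * np.exp(-f[t]) - 1.0  # score scaled by inverse information
        f[t + 1] = omega + alpha * s + beta * f[t]
    return np.exp(f / 2)                      # fitted sigma_t path

rng = np.random.default_rng(3)
returns = 0.01 * rng.standard_t(df=5, size=500)
sigma = gas_volatility(returns)
```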
Quantile Regression: 40 Years On
2017
Since Quetelet's work in the nineteenth century, social science has iconified the average man, that hypothetical man without qualities who is comfortable with his head in the oven and his feet in a bucket of ice. Conventional statistical methods since Quetelet have sought to estimate the effects of policy treatments for this average man. However, such effects are often quite heterogeneous: Medical treatments may improve life expectancy but also impose serious short-term risks; reducing class sizes may improve the performance of good students but not help weaker ones, or vice versa. Quantile regression methods can help to explore these heterogeneous effects. Some recent developments in quantile regression methods are surveyed in this review.
Journal Article
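As a concrete illustration of the heterogeneous effects discussed in the review, the snippet below fits several conditional quantiles on simulated heteroskedastic data using statsmodels' QuantReg (the data and numbers are invented for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=1000)
# The noise scale grows with x, so the slope differs across quantiles.
y = 1.0 + 0.5 * x + (0.2 + 0.1 * x) * rng.normal(size=1000)

df = pd.DataFrame({"x": x, "y": y})
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("y ~ x", df).fit(q=q)
    print(f"q = {q}: slope = {fit.params['x']:.3f}")  # slope increases with q
```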
Realized GARCH: a joint model for returns and realized measures of volatility
by Hansen, Peter Reinhard; Shek, Howard Howan; Huang, Zhuo
in Economic models; Forecasting models; GARCH models
2012
We introduce a new framework, Realized GARCH, for the joint modeling of returns and realized measures of volatility. A key feature is a measurement equation that relates the realized measure to the conditional variance of returns. The measurement equation facilitates a simple modeling of the dependence between returns and future volatility. Realized GARCH models with a linear or log-linear specification have many attractive features. They are parsimonious, simple to estimate, and imply an ARMA structure for the conditional variance and the realized measure. An empirical application with Dow Jones Industrial Average stocks and an exchange traded index fund shows that a simple Realized GARCH structure leads to substantial improvements in the empirical fit over standard GARCH models that only use daily returns.
Journal Article
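A minimal filtering sketch of the log-linear recursion described above might look like the following; the parameter values are placeholders for illustration, not the paper's estimates, and the measurement equation is left implicit (the realized measure x simply drives the variance recursion).

```python
import numpy as np

def realized_garch_filter(r, x, omega=0.06, beta=0.55, gamma=0.41):
    """Filter log h_t = omega + beta * log h_{t-1} + gamma * log x_{t-1}."""
    logh = np.empty(len(r))
    logh[0] = np.log(np.var(r))
    for t in range(1, len(r)):
        logh[t] = omega + beta * logh[t - 1] + gamma * np.log(x[t - 1])
    return np.exp(logh)  # conditional variance path

rng = np.random.default_rng(5)
x = np.exp(rng.normal(-9.0, 0.5, size=1000))  # stand-in realized measure
r = rng.normal(size=1000) * np.sqrt(x)        # returns scaled by that measure
h = realized_garch_filter(r, x)
```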
MAXIMUM LIKELIHOOD ESTIMATION AND INFERENCE FOR APPROXIMATE FACTOR MODELS OF HIGH DIMENSION
2016
An approximate factor model of high dimension has two key features. First, the idiosyncratic errors are correlated and heteroskedastic over both the cross-section and time dimensions; the correlations and heteroskedasticities are of unknown forms. Second, the number of variables is comparable to, or even greater than, the sample size. Thus, a large number of parameters exist under a high-dimensional approximate factor model. Most widely used approaches to estimation are principal component based. This paper considers the maximum likelihood-based estimation of the model. Consistency, rate of convergence, and limiting distributions are obtained under various identification restrictions. Monte Carlo simulations show that the likelihood method is easy to implement and has good finite sample properties.
Journal Article
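For orientation, scikit-learn's FactorAnalysis fits a classical maximum-likelihood factor model with heteroskedastic idiosyncratic variances across variables. That is a much simpler setting than the correlated, high-dimensional errors treated in the paper, but it shows likelihood-based factor estimation in action:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
T, N, r = 200, 50, 3
F = rng.normal(size=(T, r))                  # latent factors
L = rng.normal(size=(N, r))                  # loadings
noise = rng.normal(size=(T, N)) * rng.uniform(0.5, 1.5, size=N)
X = F @ L.T + noise                          # heteroskedastic idiosyncratic errors

fa = FactorAnalysis(n_components=r).fit(X)
factors = fa.transform(X)                    # estimated factor scores, T x r
```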
HAR Inference: Recommendations for Practice
by Lewis, Daniel J.; Lazarus, Eben; Watson, Mark W.
in Heteroscedasticity- and autocorrelation-robust estimation; Long-run variance; Time series
2018
The classic papers by Newey and West (1987) and Andrews (1991) spurred a large body of work on how to improve heteroscedasticity- and autocorrelation-robust (HAR) inference in time series regression. This literature finds that using a larger-than-usual truncation parameter to estimate the long-run variance, combined with Kiefer-Vogelsang (2002, 2005) fixed-b critical values, can substantially reduce size distortions, at only a modest cost in (size-adjusted) power. Empirical practice, however, has not kept up. This article therefore draws on the post-Newey West/Andrews literature to make concrete recommendations for HAR inference. We derive truncation parameter rules that choose a point on the size-power tradeoff to minimize a loss function. If Newey-West tests are used, we recommend the truncation parameter rule S = 1.3T^(1/2) and (nonstandard) fixed-b critical values. For tests of a single restriction, we find advantages to using the equal-weighted cosine (EWC) test, where the long-run variance is estimated by projections onto Type II cosines, using ν = 0.4T^(2/3) cosine terms; for this test, fixed-b critical values are, conveniently, t_ν or F. We assess these rules using first an ARMA/GARCH Monte Carlo design, then a dynamic factor model design estimated using 207 quarterly U.S. macroeconomic time series.
Journal Article
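The Newey-West recommendation maps directly onto standard software. Here is a sketch with statsmodels on simulated data; the rule S = 1.3T^(1/2) is taken from the abstract, but note that the authors pair it with nonstandard fixed-b critical values, which this snippet does not compute.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
T = 400
x = rng.normal(size=T)
# Moving-average errors induce serial correlation in the regression residuals.
e = np.convolve(rng.normal(size=T + 4), np.ones(5) / 5, mode="valid")
y = 1.0 + 0.5 * x + e

S = int(np.ceil(1.3 * np.sqrt(T)))  # the abstract's truncation rule
fit = sm.OLS(y, sm.add_constant(x)).fit(cov_type="HAC", cov_kwds={"maxlags": S})
print(fit.bse)  # HAR standard errors with the recommended bandwidth
```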
Identifying Cointegration by Eigenanalysis
by Robinson, Peter; Zhang, Rongmao; Yao, Qiwei
in Cointegration; Cointegration analysis; Computer simulation
2019
We propose a new and easy-to-use method for identifying cointegrated components of nonstationary time series, consisting of an eigenanalysis for a certain nonnegative definite matrix. Our setting is model-free, and we allow the integer-valued integration orders of the observable series to be unknown, and to possibly differ. Consistency of estimates of the cointegration space and cointegration rank is established both when the dimension of the observable time series is fixed as sample size increases, and when it diverges slowly. The proposed methodology is also extended and justified in a fractional setting. A Monte Carlo study of finite-sample performance, and a small empirical illustration, are reported. Supplementary materials for this article are available online.
Journal Article
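As a rough illustration of the idea (my reading of the abstract, not the paper's actual algorithm): form a nonnegative definite matrix from products of sample autocovariance matrices and inspect its eigenvectors. In this toy example the stationary combinations of the series correspond to the smallest eigenvalues; normalization, lag choice, and rank determination are handled properly in the paper, not here.

```python
import numpy as np

rng = np.random.default_rng(8)
T = 500
trend = np.cumsum(rng.normal(size=T))       # shared I(1) component
Y = np.column_stack([
    trend + rng.normal(size=T),
    trend + rng.normal(size=T),
    rng.normal(size=T),                     # a third, already stationary series
])

Yc = Y - Y.mean(axis=0)
W = np.zeros((3, 3))
for k in range(5):                          # a few autocovariance lags
    S_k = Yc[k:].T @ Yc[: T - k] / T        # lag-k sample autocovariance
    W += S_k @ S_k.T                        # nonnegative definite by construction
vals, vecs = np.linalg.eigh(W)              # eigenvalues in ascending order
print(vecs[:, :2])  # smallest-eigenvalue directions: the stationary combinations
```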