43,982 result(s) for "Diffusion models"
Factor Separation in the Atmosphere : Applications and Future Prospects
"Modeling atmospheric processes in order to forecast the weather or future climate change is an extremely complex and computationally intensive undertaking. One of the main difficulties is that there are a huge number of factors that need to be taken into account, some of which are still poorly understood. The Factor Separation (FS) method is a computational procedure that helps deal with these nonlinear factors. In recent years many scientists have applied FS methodology to a range of modeling problems, including paleoclimatology, limnology, regional climate change, rainfall analysis, cloud modeling, pollution, crop growth, and other forecasting applications. This book is the first to describe the fundamentals of the method, and to bring together its many applications in the atmospheric sciences. The main audience is researchers and graduate students using the FS method, but it is also of interest to advanced students, researchers, and professionals across the atmospheric sciences"-- Provided by publisher.
An Introductory Guide to Event Study Models
The event study model is a powerful econometric tool for estimating dynamic treatment effects. One of its most appealing features is that it provides a built-in graphical summary of results, which can reveal rich patterns of behavior. Another virtue of the picture is the estimated pre-event pseudo-"effects", which provide a type of placebo test. In this essay I aim to provide a framework for a shared understanding of these models. There are several (sometimes subtle) decisions and choices faced by users of these models, and I offer guidance for these decisions.
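The mechanics described in this abstract can be illustrated with a toy sketch: regress the outcome on event-time dummies (omitting one pre-event period as the reference), so the coefficients trace out dynamic effects and the pre-event coefficients act as the placebo test. The panel below is entirely hypothetical, with a known effect of +1.0 from the event onward.

```python
import numpy as np

# Hypothetical toy event study: 50 units, 10 periods, every unit treated at
# period 5, with a true effect of +1.0 from the event onward. The pre-event
# pseudo-"effects" should estimate near zero (the built-in placebo test).
rng = np.random.default_rng(0)
n_units, n_periods, event = 50, 10, 5
y = np.zeros((n_units, n_periods))
y[:, event:] += 1.0
y += 0.01 * rng.standard_normal(y.shape)

# Event-time dummies for k = t - event, omitting k = -1 as the reference.
ks = [k for k in range(-event, n_periods - event) if k != -1]
rows = [[1.0] + [1.0 if t - event == k else 0.0 for k in ks]
        for t in range(n_periods)]
X = np.tile(np.array(rows), (n_units, 1))
beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
coef = dict(zip(ks, beta[1:]))  # dynamic effects relative to k = -1
```

Plotting `coef` against event time gives the graphical summary the abstract refers to; real applications would add unit and time fixed effects and clustered standard errors.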
WHY YOU SHOULD NEVER USE THE HODRICK-PRESCOTT FILTER
Here’s why. (a) The Hodrick-Prescott (HP) filter introduces spurious dynamic relations that have no basis in the underlying data-generating process. (b) Filtered values at the end of the sample are very different from those in the middle and are also characterized by spurious dynamics. (c) A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice. (d) There is a better alternative. A regression of the variable at date t on the four most recent values as of date t − h achieves all the objectives sought by users of the HP filter with none of its drawbacks.
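The alternative in point (d) is concrete enough to sketch: regress the variable at date t + h on its four most recent values as of date t, and take the residual as the cyclical component. The horizon h = 8 below is an assumption (a common choice for quarterly data), not something stated in this abstract.

```python
import numpy as np

# Minimal numpy sketch of the proposed alternative to the HP filter: regress
# y_{t+h} on the four most recent values y_t, ..., y_{t-3} and take the
# regression residual as the cyclical component. h = 8 is an assumed horizon.
def regression_cycle(y, h=8, p=4):
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Column j holds y_{t-j} for each target observation y_{t+h}.
    X = np.column_stack([y[p - 1 - j : T - h - j] for j in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    target = y[p - 1 + h :]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta  # cyclical component (regression residual)
```

Because a deterministic linear trend is spanned by the regressors, the extracted cycle for a pure trend is identically zero, which is one way the method avoids the spurious dynamics the abstract criticizes.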
Constrained Factor Models for High-Dimensional Matrix-Variate Time Series
High-dimensional matrix-variate time series data are becoming widely available in many scientific fields, such as economics, biology, and meteorology. To achieve significant dimension reduction while preserving the intrinsic matrix structure and temporal dynamics in such data, Wang, Liu, and Chen proposed a matrix factor model that has been shown to provide effective analysis. In this article, we establish a general framework for incorporating domain and prior knowledge in the matrix factor model through linear constraints. The proposed framework is shown to be useful in achieving parsimonious parameterization, facilitating interpretation of the latent matrix factor, and identifying specific factors of interest. Fully utilizing the prior-knowledge-induced constraints results in more efficient and accurate modeling, inference, and dimension reduction, as well as a clearer and better interpretation of the results. Constrained, multi-term, and partially constrained factor models for matrix-variate time series are developed, together with efficient estimation procedures and their asymptotic properties. We show that the convergence rates of the constrained factor loading matrices are much faster than those of the conventional matrix factor analysis under many situations. Simulation studies are carried out to demonstrate the finite-sample performance of the proposed method and its associated asymptotic properties. We illustrate the proposed model with three applications, where the constrained matrix-factor models outperform their unconstrained counterparts in the power of variance explanation under the out-of-sample 10-fold cross-validation setting. Supplementary materials for this article are available online.
InterGen: Diffusion-Based Multi-human Motion Generation Under Complex Interactions
We have recently seen tremendous progress in diffusion-based approaches for generating realistic human motions. Yet, these approaches largely disregard multi-human interactions. In this paper, we present InterGen, an effective diffusion-based approach that enables layman users to customize high-quality two-person interaction motions, with only text guidance. We first contribute a multimodal dataset, named InterHuman. It consists of about 107M frames for diverse two-person interactions, with accurate skeletal motions and 23,337 natural language descriptions. On the algorithm side, we carefully tailor the motion diffusion model to our two-person interaction setting. To handle the symmetry of human identities during interactions, we propose two cooperative transformer-based denoisers that explicitly share weights, with a mutual attention mechanism to further connect the two denoising processes. Then, we propose a novel representation for motion input in our interaction diffusion model, which explicitly formulates the global relations between the two performers in the world frame. We further introduce two novel regularization terms to encode spatial relations, equipped with a corresponding damping scheme during the training of our interaction diffusion model. Extensive experiments validate the effectiveness of InterGen (https://tr3e.github.io/intergen-page/). Notably, it can generate more diverse and compelling two-person motions than previous methods and enables various downstream applications for human interactions.
GENERALIZED AUTOREGRESSIVE SCORE MODELS WITH APPLICATIONS
We propose a class of observation-driven time series models referred to as generalized autoregressive score (GAS) models. The mechanism to update the parameters over time is the scaled score of the likelihood function. This new approach provides a unified and consistent framework for introducing time-varying parameters in a wide class of nonlinear models. The GAS model encompasses other well-known models such as the generalized autoregressive conditional heteroskedasticity, autoregressive conditional duration, autoregressive conditional intensity, and Poisson count models with time-varying mean. In addition, our approach can lead to new formulations of observation-driven models. We illustrate our framework by introducing new model specifications for time-varying copula functions and for multivariate point processes with time-varying parameters. We study the models in detail and provide simulation and empirical evidence.
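The score-driven update the abstract describes can be sketched for the simplest member of the class, a Gaussian observation density with a time-varying mean (the parameter values below are illustrative assumptions, not estimates from the paper). The score of log N(y | mu, sigma2) with respect to mu is (y - mu) / sigma2; scaling by the inverse Fisher information sigma2 gives the driving term y_t - mu_t.

```python
import numpy as np

# Minimal GAS(1,1) sketch: Gaussian density with time-varying mean mu_t and
# known variance. The scaled score is s_t = y_t - mu_t, so the update is
#   mu_{t+1} = omega + beta * mu_t + alpha * s_t.
# omega, alpha, beta are illustrative assumptions.
def gas_gaussian_mean(y, omega=0.0, alpha=0.3, beta=0.95, mu0=0.0):
    mu = np.empty(len(y) + 1)
    mu[0] = mu0
    for t, yt in enumerate(y):
        mu[t + 1] = omega + beta * mu[t] + alpha * (yt - mu[t])
    return mu[:-1]  # filtered mean path, aligned with y
```

In this Gaussian-mean case the recursion reduces to an exponentially weighted moving average; richer members of the class (GARCH, duration, intensity, count models) arise from the same scaled-score update applied to other densities.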
Quantile Regression: 40 Years On
Since Quetelet's work in the nineteenth century, social science has iconified the average man, that hypothetical man without qualities who is comfortable with his head in the oven and his feet in a bucket of ice. Conventional statistical methods since Quetelet have sought to estimate the effects of policy treatments for this average man. However, such effects are often quite heterogeneous: Medical treatments may improve life expectancy but also impose serious short-term risks; reducing class sizes may improve the performance of good students but not help weaker ones, or vice versa. Quantile regression methods can help to explore these heterogeneous effects. Some recent developments in quantile regression methods are surveyed in this review.
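The device that makes the heterogeneity in this abstract estimable is the check ("pinball") loss, rho_tau(u) = u * (tau - 1{u < 0}): minimizing its sample average over a constant yields the tau-th sample quantile. A small numpy sketch (the data here are simulated, and the grid search stands in for the linear programming used in practice):

```python
import numpy as np

# Check ("pinball") loss: rho_tau(u) = u * (tau - 1{u < 0}). Minimizing its
# sample average over a constant recovers the tau-th sample quantile; here a
# simple grid search stands in for a proper quantile-regression solver.
def check_loss(u, tau):
    return u * (tau - (u < 0))

rng = np.random.default_rng(1)
y = rng.standard_normal(2000)  # simulated outcomes
tau = 0.9
grid = np.linspace(y.min(), y.max(), 4001)
losses = [check_loss(y - q, tau).mean() for q in grid]
q_hat = grid[int(np.argmin(losses))]  # close to the 90th percentile of y
```

Replacing the constant with a linear index x'beta gives quantile regression proper, so different quantiles of the outcome can respond differently to the same covariates.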
Realized GARCH: a joint model for returns and realized measures of volatility
We introduce a new framework, Realized GARCH, for the joint modeling of returns and realized measures of volatility. A key feature is a measurement equation that relates the realized measure to the conditional variance of returns. The measurement equation facilitates a simple modeling of the dependence between returns and future volatility. Realized GARCH models with a linear or log-linear specification have many attractive features. They are parsimonious, simple to estimate, and imply an ARMA structure for the conditional variance and the realized measure. An empirical application with Dow Jones Industrial Average stocks and an exchange traded index fund shows that a simple Realized GARCH structure leads to substantial improvements in the empirical fit over standard GARCH models that only use daily returns.
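The log-linear specification the abstract mentions can be sketched as a simple recursion: a GARCH equation that updates the conditional variance from the lagged realized measure, paired with a measurement equation linking the realized measure to that variance. The parameter values below are illustrative assumptions, not the paper's estimates, and the measurement-equation error is omitted for brevity.

```python
import numpy as np

# Minimal sketch of the log-linear Realized GARCH recursion:
#   GARCH equation:       log h_t = omega + beta * log h_{t-1} + gamma * log x_{t-1}
#   measurement equation: log x_t = xi + phi * log h_t + u_t   (not simulated here)
# h_t = conditional variance of returns, x_t = realized measure (e.g. realized
# variance). omega, beta, gamma are illustrative assumptions.
def realized_garch_filter(x, omega=0.06, beta=0.55, gamma=0.41, h0=1.0):
    logh = np.empty(len(x))
    logh[0] = np.log(h0)
    for t in range(1, len(x)):
        logh[t] = omega + beta * logh[t - 1] + gamma * np.log(x[t - 1])
    return np.exp(logh)  # conditional variance path
```

Substituting the measurement equation into the GARCH equation shows why log h_t inherits an ARMA structure, as the abstract notes.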
Quantile Co-Movement in Financial Markets: A Panel Quantile Model With Unobserved Heterogeneity
This article introduces a new procedure for analyzing the quantile co-movement of a large number of financial time series based on a large-scale panel data model with factor structures. The proposed method attempts to capture the unobservable heterogeneity of each of the financial time series based on sensitivity to explanatory variables and to the unobservable factor structure. In our model, the dimension of the common factor structure varies across quantiles, and the explanatory variables are allowed to depend on the factor structure. The proposed method allows for both cross-sectional and serial dependence, and heteroscedasticity, which are common in financial markets. We propose new estimation procedures for both frequentist and Bayesian frameworks. Consistency and asymptotic normality of the proposed estimator are established. We also propose a new model selection criterion for determining the number of common factors together with theoretical support. We apply the method to analyze the returns for over 6000 international stocks from over 60 countries during the subprime crisis, European sovereign debt crisis, and subsequent period. The empirical analysis indicates that the common factor structure varies across quantiles. We find that the common factors for the quantiles and the common factors for the mean are different. Supplementary materials for this article are available online.
MAXIMUM LIKELIHOOD ESTIMATION AND INFERENCE FOR APPROXIMATE FACTOR MODELS OF HIGH DIMENSION
An approximate factor model of high dimension has two key features. First, the idiosyncratic errors are correlated and heteroskedastic over both the cross-section and time dimensions; the correlations and heteroskedasticities are of unknown forms. Second, the number of variables is comparable or even greater than the sample size. Thus, a large number of parameters exist under a high-dimensional approximate factor model. Most widely used approaches to estimation are principal component based. This paper considers the maximum likelihood-based estimation of the model. Consistency, rate of convergence, and limiting distributions are obtained under various identification restrictions. Monte Carlo simulations show that the likelihood method is easy to implement and has good finite sample properties.