28 results for "Koul, H. L. (Hira L.)"
Martingale Transforms Goodness-of-Fit Tests in Regression Models
This paper discusses two goodness-of-fit testing problems. The first problem pertains to fitting an error distribution to an assumed nonlinear parametric regression model, while the second pertains to fitting a parametric regression model when the error distribution is unknown. For the first problem the paper contains tests based on a certain martingale type transform of residual empirical processes. The advantage of this transform is that the corresponding tests are asymptotically distribution free. For the second problem the proposed asymptotically distribution free tests are based on innovation martingale transforms. A Monte Carlo study shows that the simulated level of the proposed tests is close to the asymptotic level for moderate sample sizes.
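As a rough illustration of the residual-based setup these tests build on (the martingale-type transform itself is not reproduced), the sketch below fits a simple linear model, forms standardized residuals, and runs a Monte Carlo loop to check the simulated level of a naive Kolmogorov-Smirnov statistic. The model, sample size, and use of numpy/scipy (scipy >= 1.4) are assumptions made only for this example.

```python
# Minimal sketch (not the paper's martingale transform): it illustrates the
# residual empirical process underlying such tests and a Monte Carlo check of
# a test's simulated level under the null.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def ks_residual_stat(x, y):
    """Fit a simple linear regression and return the KS distance between the
    standardized residuals' empirical CDF and the standard normal CDF."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    resid = (resid - resid.mean()) / resid.std(ddof=2)
    return stats.kstest(resid, "norm").statistic

# Monte Carlo level under the null (normal errors), using the naive KS
# critical value; with estimated parameters this value is only indicative.
n, reps = 200, 1000
crit = stats.kstwo.ppf(0.95, n)
rejections = 0
for _ in range(reps):
    x = rng.uniform(0, 1, n)
    y = 1.0 + 2.0 * x + rng.normal(size=n)
    rejections += ks_residual_stat(x, y) > crit
print("simulated level:", rejections / reps)
```

Because the regression parameters are estimated, the naive critical value is only indicative; making such tests asymptotically distribution free is exactly what the martingale-type transform described above is for.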
Nonparametric Model Checks for Time Series
This paper studies a class of tests useful for testing the goodness-of-fit of an autoregressive model. These tests are based on a class of empirical processes marked by certain residuals. The paper first gives their large sample behavior under null hypotheses. Then a martingale transformation of the underlying process is given that makes tests based on it asymptotically distribution free. Consistency of these tests is also discussed briefly.
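A minimal sketch of the kind of residual-marked empirical process described above, under an AR(1) null. The martingale transformation that makes the resulting test asymptotically distribution free is not shown, and the model, parameter values, and sample size are illustrative assumptions.

```python
# Marked empirical process V_n(x) = n^{-1/2} * sum_i resid_i * 1(X_{i-1} <= x),
# evaluated at the observed lagged values, and its sup statistic.
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series under the null X_t = 0.5 X_{t-1} + e_t.
n = 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()

lag, cur = x[:-1], x[1:]
theta_hat = np.sum(lag * cur) / np.sum(lag ** 2)   # least-squares AR(1) fit
resid = cur - theta_hat * lag

# Marked empirical process evaluated at each observed lag value.
order = np.argsort(lag)
V = np.cumsum(resid[order]) / np.sqrt(len(resid))
sup_stat = np.max(np.abs(V))
print("sup |V_n|:", sup_stat)
```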
Frontiers in Statistics
During the last two decades, many areas of statistical inference have experienced phenomenal growth. This book presents a timely analysis and overview of some of these new developments and a contemporary outlook on the various frontiers of statistics. Eminent leaders in the field have contributed 16 review articles and 6 research articles covering areas including semi-parametric models, data analytical nonparametric methods, statistical learning, network tomography, longitudinal data analysis, financial econometrics, time series, bootstrap and other re-sampling methodologies, statistical computing, generalized nonlinear regression and mixed effects models, martingale transform tests for model diagnostics, robust multivariate analysis, single index models and wavelets. This volume is dedicated to Prof. Peter J Bickel in honor of his 65th birthday. The first article of this volume summarizes some of Prof. Bickel's distinguished contributions.
Efficient Estimation in Nonlinear Autoregressive Time-Series Models
This paper discusses efficient estimation for a class of nonlinear time-series models with unknown error densities. It establishes local asymptotic normality in this semi-parametric setting. This is then used to describe efficient estimates and to discuss the question of adaptation. Stein's necessary condition for adaptive estimation is satisfied if the error densities are symmetric, but is also satisfied in some models with asymmetric error densities. The paper gives several methods of constructing efficient estimates. These results are then applied to construct efficient estimators in SETAR(2; 1, 1), EXPAR(1) and ARMA(1, 1) models. We observe that adaptation is not possible in the SETAR(2; 1, 1) model with asymmetric errors while the efficient estimators in the ARMA(1, 1) model are adaptive even for asymmetric error densities. Section 8 contains a result that is useful in verifying the continuity of the stationary density with respect to the underlying parameters.
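The sketch below only simulates a SETAR(2; 1, 1) process and fits the two regime slopes by conditional least squares; it is not the paper's efficient or adaptive estimator, and the threshold, parameter values, and error law are assumptions made for illustration.

```python
# SETAR(2; 1, 1): X_t = a1*X_{t-1} + e_t if X_{t-1} <= r, else a2*X_{t-1} + e_t.
# Conditional least squares with the threshold fixed at its true value (0 here).
import numpy as np

rng = np.random.default_rng(2)

n, a1, a2, r = 1000, 0.3, -0.6, 0.0
x = np.zeros(n)
for t in range(1, n):
    slope = a1 if x[t - 1] <= r else a2
    x[t] = slope * x[t - 1] + rng.normal()

lag, cur = x[:-1], x[1:]
low = lag <= r
a1_hat = np.sum(lag[low] * cur[low]) / np.sum(lag[low] ** 2)
a2_hat = np.sum(lag[~low] * cur[~low]) / np.sum(lag[~low] ** 2)
print("CLS estimates:", a1_hat, a2_hat)
```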
Regression Model Fitting with a Long Memory Covariate Process
This paper proposes some tests for fitting a regression model with a long memory covariate process and with errors that form either a martingale difference sequence or a long memory moving average process, independent of the covariate. The tests are based on a partial sum process of the residuals from the fitted regression. The asymptotic null distribution of this process is discussed in some detail under each set of these assumptions. The proposed tests are shown to have known asymptotic null distributions in the case of martingale difference errors and also in the case of fitting a polynomial of a known degree through the origin when the errors have long memory. The theory is then illustrated with some examples based on the forward premium anomaly where a squared interest rate differential proxies a time-dependent risk premium. The paper also shows that the proposed test statistic converges weakly to nonstandard distributions in some cases. The authors gratefully acknowledge the helpful comments of the co-editor Don Andrews and two anonymous referees. The research of the first two authors was partly supported by NSF grant DMS 00-71619.
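As a sketch of the residual partial-sum process the tests are built on, the following fits a simple linear regression and forms the cumulative sums of residuals ordered by the covariate. The i.i.d. covariate and errors are simplifying assumptions; the long memory cases and the limit theory from the paper are not reproduced.

```python
# Partial-sum (CUSUM) process of regression residuals, normalized by sqrt(n).
import numpy as np

rng = np.random.default_rng(3)

n = 400
z = rng.normal(size=n)                 # covariate (i.i.d. here, an assumption)
y = 0.5 + 1.5 * z + rng.normal(size=n)

X = np.column_stack([np.ones(n), z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Partial sums of residuals ordered by the covariate.
order = np.argsort(z)
S = np.cumsum(resid[order]) / np.sqrt(n)
print("sup |S_n|:", np.max(np.abs(S)))
```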
Asymptotics of Some Estimators and Sequential Residual Empiricals in Nonlinear Time Series
This paper establishes the asymptotic uniform linearity of M- and R-scores in a family of nonlinear time series and regression models. It also gives an asymptotic expansion of the standardized sequential residual empirical process in these models. These results are, in turn, used to obtain the asymptotic normality of certain classes of M-, R- and minimum distance estimators of the underlying parameters. The classes of estimators considered include analogs of Hodges-Lehmann, Huber and LAD (least absolute deviation) estimators. Some applications to the change point and testing of the goodness-of-fit problems in threshold and amplitude-dependent exponential autoregression models are also given. The paper thus offers a unified functional approach to some aspects of robust inference for a large class of nonlinear time series models.
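To make the estimator classes named above concrete, here is a small sketch of LAD and Huber-type M-estimation of an AR(1) slope under heavy-tailed errors. The paper's nonlinear models, R-estimators, and asymptotic expansions are not reproduced, and all numerical choices are illustrative.

```python
# LAD and Huber M-estimation of an AR(1) slope by minimizing the respective
# loss functions over a bounded parameter range.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)

n = 800
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.4 * x[t - 1] + rng.standard_t(df=3)   # heavy-tailed errors

lag, cur = x[:-1], x[1:]

def lad_loss(theta):
    # Least absolute deviation criterion.
    return np.sum(np.abs(cur - theta * lag))

def huber_loss(theta, c=1.345):
    # Huber criterion: quadratic near zero, linear in the tails.
    u = cur - theta * lag
    return np.sum(np.where(np.abs(u) <= c, 0.5 * u**2, c * (np.abs(u) - 0.5 * c)))

lad_hat = minimize_scalar(lad_loss, bounds=(-0.99, 0.99), method="bounded").x
hub_hat = minimize_scalar(huber_loss, bounds=(-0.99, 0.99), method="bounded").x
print("LAD:", lad_hat, "Huber:", hub_hat)
```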
Asymptotic Expansion of M-Estimators with Long-Memory Errors
This paper obtains a higher-order asymptotic expansion of a class of M-estimators of the one-sample location parameter when the errors form a long-memory moving average. A suitably standardized difference between an M-estimator and the sample mean is shown to have a limiting distribution. The nature of the limiting distribution depends on the range of the dependence parameter $\theta$. If, for example, $1/3 < \theta < 1$, then a suitably standardized difference between the sample median and the sample mean converges weakly to a normal distribution provided the common error distribution is symmetric. If $0 < \theta < 1/3$, then the corresponding limiting distribution is nonnormal. This paper thus goes beyond that of Beran, who observed, in the case of long-memory Gaussian errors, that M-estimators $T_n$ of the one-sample location parameter are asymptotically equivalent to the sample mean in the sense that $\operatorname{Var}(T_n)/\operatorname{Var}(\bar{X}_n) \rightarrow 1$ and $T_n = \bar{X}_n + O_p(\sqrt{\operatorname{Var}(\bar{X}_n)})$.
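A minimal simulation sketch, not the paper's expansion: it generates a truncated long-memory moving average with hyperbolically decaying coefficients $a_j \propto (j+1)^{-(1+\theta)/2}$ (an illustrative parametrization, not taken from the paper) and compares Monte Carlo variances of the sample mean and sample median.

```python
# Compare Monte Carlo variances of the sample mean and sample median of a
# truncated long-memory moving average with symmetric (normal) errors.
import numpy as np

rng = np.random.default_rng(5)

theta, n, m, reps = 0.6, 500, 1000, 200
a = (np.arange(m) + 1.0) ** (-(1.0 + theta) / 2.0)   # hyperbolically decaying weights

means, medians = [], []
for _ in range(reps):
    eps = rng.normal(size=n + m)
    x = np.convolve(eps, a, mode="valid")[:n]         # X_t = sum_j a_j * eps_{t-j}
    means.append(x.mean())
    medians.append(np.median(x))

print("Var(mean):  ", np.var(means))
print("Var(median):", np.var(medians))
print("ratio:", np.var(medians) / np.var(means))
```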
Adaptive Estimation in a Random Coefficient Autoregressive Model
This paper proves the local asymptotic normality of a stationary and ergodic first order random coefficient autoregressive model in a semiparametric setting. This result is used to show that Stein's necessary condition for adaptive estimation of the mean of the random coefficient is satisfied if the distributions of the innovations and the errors in the random coefficients are symmetric around zero. Under these symmetry assumptions, a locally asymptotically minimax adaptive estimator of the mean of the random coefficient is constructed. The paper also proves the asymptotic normality of generalized M-estimators of the parameter of interest. These estimators are used as preliminary estimators in the above construction.
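The sketch below simulates the RCA(1) model $X_t = (\phi + b_t)X_{t-1} + \epsilon_t$ and computes a plain least-squares estimate of $\phi$ as a stand-in for a preliminary estimator; the generalized M-estimators and the adaptive construction from the paper are not reproduced, and all parameter values are assumptions.

```python
# Random coefficient AR(1): X_t = (phi + b_t) X_{t-1} + e_t, with b_t and e_t
# symmetric mean-zero; phi estimated here by ordinary least squares.
import numpy as np

rng = np.random.default_rng(6)

n, phi, sigma_b = 2000, 0.4, 0.3
x = np.zeros(n)
for t in range(1, n):
    b = rng.normal(scale=sigma_b)            # random coefficient, mean zero
    x[t] = (phi + b) * x[t - 1] + rng.normal()

lag, cur = x[:-1], x[1:]
phi_hat = np.sum(lag * cur) / np.sum(lag ** 2)
print("LS estimate of phi:", phi_hat)
```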