1,968 result(s) for "Empirical likelihood"
Jackknife empirical likelihood for the correlation coefficient with additive distortion measurement errors
The correlation coefficient is fundamental in advanced statistical analysis. However, traditional methods of calculating correlation coefficients can be biased by confounding variables, which may act in an additive or multiplicative fashion. For the additive model, previous research has proposed residual-based estimators of the correlation coefficient. The powerful tool of empirical likelihood (EL) has been used to construct confidence intervals for the correlation coefficient, but the existing methods perform well only when sample sizes are large; in small-sample situations, the coverage probability of EL can fall below 90% at the 95% confidence level. Building on previous research, we propose new interval estimation methods for the correlation coefficient using jackknife empirical likelihood, mean jackknife empirical likelihood and adjusted jackknife empirical likelihood. For better performance with small sample sizes, we also propose mean adjusted jackknife empirical likelihood. The simulation results show the best performance with mean adjusted jackknife empirical likelihood when the sample sizes are as small as 25. Real data analyses are used to illustrate the proposed approach.
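As a rough illustration of the jackknife empirical likelihood idea described above, the sketch below applies Owen-style empirical likelihood to jackknife pseudo-values of the sample correlation. This is a minimal sketch under standard assumptions; the function names and the Newton safeguard are ours, not taken from the paper.

```python
import numpy as np

def jackknife_pseudo_values(x, y):
    """Jackknife pseudo-values of the sample correlation coefficient."""
    n = len(x)
    theta_full = np.corrcoef(x, y)[0, 1]
    leave_one_out = np.array([
        np.corrcoef(np.delete(x, i), np.delete(y, i))[0, 1]
        for i in range(n)
    ])
    return n * theta_full - (n - 1) * leave_one_out

def el_log_ratio(v, theta, tol=1e-10, max_iter=100):
    """-2 log empirical likelihood ratio for the mean of v at value theta,
    computed via Newton's method on the Lagrange multiplier."""
    z = v - theta
    if z.min() >= 0 or z.max() <= 0:
        return np.inf  # theta lies outside the convex hull of the data
    lam = 0.0
    for _ in range(max_iter):
        denom = 1.0 + lam * z
        grad = np.sum(z / denom)
        hess = -np.sum(z ** 2 / denom ** 2)
        step = grad / hess
        lam_new = lam - step
        # halve the step until all implied EL weights stay positive
        while np.any(1.0 + lam_new * z <= 0):
            step /= 2.0
            lam_new = lam - step
        lam = lam_new
        if abs(grad) < tol:
            break
    return 2.0 * np.sum(np.log(1.0 + lam * z))
```

A 95% JEL confidence interval then collects all values of theta whose -2 log ratio does not exceed the 0.95 quantile of the chi-squared distribution with one degree of freedom (about 3.84).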
PENALIZED GENERALIZED EMPIRICAL LIKELIHOOD WITH A DIVERGING NUMBER OF GENERAL ESTIMATING EQUATIONS FOR CENSORED DATA
This article considers simultaneous variable selection, parameter estimation, and hypothesis testing in censored survival models where a parametric likelihood is not available. For this problem, we utilize certain growing-dimensional general estimating equations and propose a penalized generalized empirical likelihood, where the general estimating equations are constructed based on the semiparametric efficiency bound of estimation with given moment conditions. The proposed penalized generalized empirical likelihood estimators enjoy the oracle properties, and the estimator of any fixed-dimensional vector of nonzero parameters achieves the semiparametric efficiency bound asymptotically. Furthermore, we show that the penalized generalized empirical likelihood ratio test statistic has an asymptotic central chi-square distribution. The conditions of local and restricted global optimality of weighted penalized generalized empirical likelihood estimators are also discussed. We present a two-layer iterative algorithm for efficient implementation and investigate its convergence property. The performance of the proposed methods is demonstrated by extensive simulation studies, and a real data example is provided for illustration.
HYPOTHESIS TESTING IN THE PRESENCE OF MULTIPLE SAMPLES UNDER DENSITY RATIO MODELS
This paper presents a hypothesis testing method given independent samples from a number of connected populations. The method is motivated by a forestry project for monitoring change in the strength of lumber. Traditional practice has been built upon nonparametric methods which ignore the fact that these populations are connected. By pooling the information in multiple samples through a density ratio model, the proposed empirical likelihood method leads to more efficient inferences and therefore reduces the cost in applications. The new test has a classical chi-square null limiting distribution. Its power function is obtained under a class of local alternatives. The local power is found increased even when some underlying populations are unrelated to the hypothesis of interest. Simulation studies confirm that this test has better power properties than potential competitors, and is robust to model misspecification. An application example to lumber strength is included.
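The pooling idea above rests on the density ratio model, whose standard formulation links each population's density $f_k$ to a baseline $f_0$ (the basis function $\boldsymbol{q}(x)$ below is whatever the analyst specifies; the abstract does not state a particular choice):

```latex
f_k(x) = \exp\{\alpha_k + \boldsymbol{\beta}_k^{\top} \boldsymbol{q}(x)\}\, f_0(x), \qquad k = 1, \ldots, m.
```

Under this model all $m+1$ samples contribute to estimating the shared baseline $f_0$, which is the source of the efficiency gain over treating the samples separately.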
Transforming the empirical likelihood towards better accuracy
Under-coverage has been a long-standing issue with the empirical likelihood confidence region. Several methods can address this issue, but they all add complexity to empirical likelihood inference, requiring extra computation and/or extra theoretical investigation. The objective of this article is to find a method that does not add complexity. To this end we look for a simple transformation of the empirical likelihood that alleviates the under-coverage. Using several criteria concerning accuracy, consistency, and preservation of the geometric appeal of the original empirical likelihood, we obtain a transformed version of the empirical likelihood that is extremely simple in theory and computation. Its confidence regions are surprisingly accurate, even in small-sample and multidimensional situations. It can easily be used to alleviate the under-coverage problem of empirical likelihood confidence regions.
Nonparametric confidence intervals for generalized Lorenz curve using modified empirical likelihood
The Lorenz curve portrays income distribution inequality. In this article, we develop three modified empirical likelihood (EL) approaches, including adjusted empirical likelihood, transformed empirical likelihood, and transformed adjusted empirical likelihood, to construct confidence intervals for the generalized Lorenz ordinate. We demonstrate that the limiting distribution of the modified EL ratio statistics for the generalized Lorenz ordinate follows scaled chi-squared distributions with one degree of freedom. We compare the coverage probabilities and mean lengths of confidence intervals of the proposed methods with the traditional EL method through simulations under various scenarios. Finally, we illustrate the proposed methods using real data to construct confidence intervals.
Ensemble Approaches to Estimating the Population Mean with Missing Response
We propose new ensemble approaches to estimate the population mean for missing response data with fully observed auxiliary variables. We first compress the working models according to their categories through a weighted average, where the weights are proportional to the square of the least-squares coefficients of model refitting. Based on the compressed values, we develop two ensemble frameworks, under which one is to adjust weights in the inverse probability weighting procedure and the other is built upon an additive structure by reformulating the augmented inverse probability weighting function. The asymptotic normality property is established for the proposed estimators through the theory of estimating functions with plugged-in nuisance parameter estimates. Simulation studies show that the new proposals have substantial advantages over existing ones for small sample sizes, and an acquired immune deficiency syndrome data example is used for illustration.
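For reference, the augmented inverse probability weighting (AIPW) function that the second framework reformulates has the following textbook form (this is the standard estimator, not the authors' specific ensemble version):

```latex
\hat{\mu}_{\mathrm{AIPW}} = \frac{1}{n} \sum_{i=1}^{n} \left[ \frac{R_i Y_i}{\hat{\pi}(X_i)} - \frac{R_i - \hat{\pi}(X_i)}{\hat{\pi}(X_i)}\, \hat{m}(X_i) \right],
```

where $R_i$ indicates whether $Y_i$ is observed, $\hat{\pi}$ is the estimated response probability given the auxiliary variables $X_i$, and $\hat{m}$ is a fitted outcome regression. The estimator is doubly robust: it remains consistent if either $\hat{\pi}$ or $\hat{m}$ is correctly specified.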
Empirical Likelihood for Generalized Linear Models with Longitudinal Data
Generalized linear models are usually adopted to model discrete or nonnegative responses. In this paper, empirical likelihood inference for fixed-design generalized linear models with longitudinal data is investigated. Under some mild conditions, the consistency and asymptotic normality of the maximum empirical likelihood estimator are established, and the asymptotic χ² distribution of the empirical log-likelihood ratio is also obtained. Compared with existing results, the new conditions are weaker and easier to verify. Some simulations are presented to illustrate these asymptotic properties.
Weighted Empirical Likelihood for Accelerated Life Model with Various Types of Censored Data
In the analysis of survival data, the Accelerated Life Model (ALM) is one of the most widely used semiparametric models, and we often encounter various types of censored survival data, such as right-censored, doubly censored, interval-censored, and partly interval-censored data. For complicated types of censored data, statistical inference on the ALM is technically and mathematically challenging, and thus little work has been done to date. In this article, we extend the concept of weighted empirical likelihood (WEL) from the univariate case to the multivariate case and apply it to the ALM, which leads to an estimation approach, called the weighted maximum likelihood estimator, as well as a WEL-based confidence interval for the regression parameter. Our proposed procedures are applicable to various types of censored data under a unified framework, and some simulation results are presented.
Jackknife and Transformed Jackknife Empirical Likelihood Inferences for the Lifetime Performance Index with Missing Data
An important topic in manufacturing industries is assessing lifetime performance from a sample, but many factors may lead to missing data. We propose jackknife empirical likelihood (JEL) and transformed jackknife empirical likelihood (TJEL) methods to construct confidence intervals for the lifetime performance index with missing data. After filling in the missing values by hot deck imputation, we apply the JEL and TJEL methods. Simulation studies are used to evaluate the proposed methods and some competitors in terms of coverage probability and average length of confidence intervals. A real data set is used to illustrate the proposed JEL and TJEL methods and the competing methods.
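The hot deck imputation step mentioned above can be sketched as follows. This is a minimal random hot deck (each missing value is replaced by a draw from the observed donors); the paper's exact donor-matching scheme is not specified in the abstract, and the function name is ours.

```python
import numpy as np

def hot_deck_impute(values, rng=None):
    """Simple random hot deck: replace each missing entry (NaN) with a
    value drawn uniformly, with replacement, from the observed donors."""
    rng = np.random.default_rng() if rng is None else rng
    values = np.asarray(values, dtype=float)
    missing = np.isnan(values)
    donors = values[~missing]
    out = values.copy()
    out[missing] = rng.choice(donors, size=missing.sum(), replace=True)
    return out
```

After imputation the completed sample is treated as fully observed, and the JEL/TJEL confidence intervals are built on it.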
Empirical Likelihood Semiparametric Regression Analysis for Longitudinal Data
A semiparametric regression model for longitudinal data is considered. The empirical likelihood method is used to estimate the regression coefficients and the baseline function, and to construct confidence regions and intervals. It is proved that the maximum empirical likelihood estimator of the regression coefficients achieves asymptotic efficiency and the estimator of the baseline function attains asymptotic normality when a bias correction is made. Two calibrated empirical likelihood approaches to inference for the baseline function are developed. We propose a groupwise empirical likelihood procedure to handle the inter-series dependence for the longitudinal semiparametric regression model, and employ bias correction to construct the empirical likelihood ratio functions for the parameters of interest. This leads us to prove a nonparametric version of Wilks' theorem. Compared with methods based on normal approximations, the empirical likelihood does not require consistent estimators for the asymptotic variance and bias. A simulation compares the empirical likelihood and normal-based methods in terms of coverage accuracies and average areas/lengths of confidence regions/intervals.