Catalogue Search | MBRL
Explore the vast range of titles available.
95 result(s) for "Hazard ratio estimation"
Goodness-of-fit two-phase sampling designs for time-to-event outcomes: a simulation study based on New York University Women’s Health Study for breast cancer
2023
Background
Sub-cohort sampling designs such as the case-cohort study play a key role in studying biomarker-disease associations due to their cost effectiveness. The time-to-event outcome is often the focus in cohort studies, and the research goal is to assess the association between the event risk and risk factors. In this paper, we propose a novel goodness-of-fit two-phase sampling design for time-to-event outcomes when some covariates (e.g., biomarkers) can be measured only on a subgroup of study subjects.
Methods
Assuming that an external model relating the outcome to the complete covariates is available, which can be a well-established risk model (such as the Gail model for breast cancer, the Gleason score for prostate cancer, or the Framingham risk model for heart disease) or a model built from preliminary data, we propose to oversample subjects with worse goodness-of-fit (GOF) to that external survival model given their observed time-to-event. With the cases and controls sampled under the GOF two-phase design, the inverse sampling probability weighting method is used to estimate the log hazard ratios of both the incomplete and the complete covariates. We conducted extensive simulations to evaluate the efficiency gain of the proposed GOF two-phase sampling designs over case-cohort study designs.
Results
Through extensive simulations based on a dataset from the New York University Women’s Health Study, we showed that the proposed GOF two-phase sampling designs were unbiased and generally had higher efficiency compared to the standard case-cohort study designs.
Conclusion
In cohort studies with rare outcomes, an important design question is how to select informative subjects to reduce sampling costs while maintaining statistical efficiency. Our proposed goodness-of-fit two-phase design provides efficient alternatives to standard case-cohort designs for assessing the association between time-to-event outcome and risk factors. This method is conveniently implemented in standard software.
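As a concrete illustration of the weighting step described under Methods, here is a minimal sketch of an inverse-sampling-probability-weighted Cox model fitted with Python's lifelines. The simulated cohort, the 20% control-sampling fraction, and all column names are hypothetical; the sketch shows the general IPW idea, not the authors' implementation.

```python
# Minimal sketch: inverse-sampling-probability-weighted Cox regression on a
# two-phase (case-control-like) subsample. Data and sampling fractions are made up.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(60, 8, n)                                  # complete covariate
biomarker = rng.normal(0, 1, n)                             # covariate measured only in phase 2
t_event = rng.exponential(20 * np.exp(-(0.02 * (age - 60) + 0.5 * biomarker)))
t_cens = rng.uniform(5, 25, n)
cohort = pd.DataFrame({
    "time": np.minimum(t_event, t_cens),
    "event": (t_event <= t_cens).astype(int),
    "age": age,
    "biomarker": biomarker,
})

# Phase-2 selection: keep all cases, sample 20% of non-cases (design probabilities known)
cohort["sampling_prob"] = np.where(cohort["event"] == 1, 1.0, 0.2)
phase2 = cohort[rng.uniform(size=n) < cohort["sampling_prob"]].copy()
phase2["ipw"] = 1.0 / phase2["sampling_prob"]               # inverse sampling probability weights

cph = CoxPHFitter()
cph.fit(phase2[["time", "event", "age", "biomarker", "ipw"]],
        duration_col="time", event_col="event",
        weights_col="ipw", robust=True)                     # weighted partial likelihood + sandwich SEs
print(cph.summary[["coef", "se(coef)"]])                    # log hazard ratios for age and biomarker
```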
Journal Article
Testing and confidence intervals for high dimensional proportional hazards models
by Fang, Ethan X.; Ning, Yang; Liu, Han
in Asymptotic methods, Censored data, Computer simulation
2017
The paper considers the problem of hypothesis testing and confidence intervals in high dimensional proportional hazards models. Motivated by a geometric projection principle, we propose a unified likelihood ratio inferential framework, including score, Wald and partial likelihood ratio statistics for hypothesis testing. Without assuming model selection consistency, we derive the asymptotic distributions of these test statistics, establish their semiparametric optimality and conduct power analysis under Pitman alternatives. We also develop new procedures to construct pointwise confidence intervals for the baseline hazard function and conditional hazard function. Simulation studies show that all tests proposed perform well in controlling type I errors. Moreover, the partial likelihood ratio test is empirically more powerful than the other tests. The methods proposed are illustrated by an example of a gene expression data set.
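The model class considered here can be fitted with off-the-shelf penalized Cox software; the sketch below uses lifelines with a lasso-type penalty on simulated high-dimensional data. It illustrates only the fitting step, not the decorrelated score, Wald, or partial likelihood ratio inference developed in the paper, and every simulation setting is made up.

```python
# Sketch: fitting an l1-penalized Cox model when the number of covariates is large.
# This shows the model class only; it does not implement the paper's inference.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))
hazard = np.exp(0.8 * X[:, 0] - 0.5 * X[:, 1])              # only two truly active covariates
t_event = rng.exponential(1.0 / hazard)
t_cens = rng.exponential(2.0, size=n)

df = pd.DataFrame(X, columns=[f"x{j}" for j in range(p)])
df["time"] = np.minimum(t_event, t_cens)
df["event"] = (t_event <= t_cens).astype(int)

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)              # lasso-type penalty
cph.fit(df, duration_col="time", event_col="event")
print(cph.params_.abs().sort_values(ascending=False).head())  # x0 and x1 should dominate
```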
Journal Article
Multi-Parameter Regression Survival Modeling: An Alternative to Proportional Hazards
2017
It is standard practice for covariates to enter a parametric model through a single distributional parameter of interest, for example, the scale parameter in many standard survival models. Indeed, the well-known proportional hazards model is of this kind. In this article, we discuss a more general approach whereby covariates enter the model through more than one distributional parameter simultaneously (e.g., scale and shape parameters). We refer to this practice as "multi-parameter regression" (MPR) modeling and explore its use in a survival analysis context. We find that multi-parameter regression leads to more flexible models which can offer greater insight into the underlying data generating process. To illustrate the concept, we consider the two-parameter Weibull model which leads to time-dependent hazard ratios, thus relaxing the typical proportional hazards assumption and motivating a new test of proportionality. A novel variable selection strategy is introduced for such multi-parameter regression models. It accounts for the correlation arising between the estimated regression coefficients in two or more linear predictors, a feature which has not been considered by other authors in similar settings. The methods discussed have been implemented in the mpr package in R.
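A small numerical illustration of the central point, that letting a covariate act on both the Weibull scale and shape makes the hazard ratio vary with time: the parameter values below are hypothetical, and plain NumPy is used rather than the mpr package (which is written in R).

```python
# Sketch: time-dependent hazard ratio under a two-parameter Weibull in which a
# binary covariate z enters both the scale and the shape (the MPR idea).
# Parameter values are hypothetical, chosen only for illustration.
import numpy as np

def weibull_hazard(t, scale, shape):
    """Weibull hazard h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def params(z, beta=(0.0, 0.4), gamma=(0.2, 0.3)):
    """Covariate enters scale and shape through log-linear predictors."""
    return np.exp(beta[0] + beta[1] * z), np.exp(gamma[0] + gamma[1] * z)

t = np.linspace(0.1, 5.0, 6)
hr = weibull_hazard(t, *params(1)) / weibull_hazard(t, *params(0))
print(np.round(hr, 3))   # the ratio changes with t, so proportional hazards fails
# With gamma = (0.2, 0.0), i.e. no shape effect, the ratio is constant in t.
```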
Journal Article
Seismic hazard assessment for mainland China based on spatially smoothed seismicity
2020
Probabilistic seismic hazard assessment for mainland China is carried out by using spatially smoothed seismic source models obtained using the historical earthquake catalogue and two kernel smoothing techniques. The first one smooths the cumulative event count and the second one smooths the earthquake occurrence rate. For the analysis, the completeness of the catalogue is assessed and the k-means seismic analysis results are used to aid the assignment of the regions. It is shown that the estimated parameters for the Gutenberg-Richter relation by using the maximum likelihood–based approach are consistent with those estimated by using the least-squares-based approach by considering unequal observation periods. The developed seismic hazard maps indicate that the use of the smoothed source model obtained by smoothing the cumulative event count leads to spatially smeared seismic hazard. The use of the smoothed source model obtained by smoothing the occurrence rate results in a seismic hazard map with more spatial detail. The seismic hazard assessment results indicate that, on average, the ratio of the 2475-year to 475-year return period values of PGA is about 2 and the ratio of the 2475-year to 50-year return period values of PGA is about 8. These ratios are associated with large scatter. A comparison of the normalized UHS and the Chinese design code recommended standardized design spectrum indicates that the latter is conservative compared to the former for vibration periods outside of 0.1 to 0.2 s, and that the shape of the latter differs from that of the former.
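The maximum-likelihood estimation mentioned in this abstract is commonly done with Aki's estimator for the Gutenberg-Richter relation log10 N(>=M) = a - bM; the sketch below applies it to a synthetic catalogue. The completeness magnitude, bin width, and catalogue are hypothetical, and the paper's actual procedure additionally accounts for unequal observation periods, which this sketch does not.

```python
# Sketch: Aki's maximum-likelihood b-value for the Gutenberg-Richter relation.
# Synthetic magnitudes are drawn so that the true b-value is 1.0.
import numpy as np

def aki_b_value(magnitudes, mc, dm=0.0):
    """Aki/Utsu ML b-value for events with M >= mc; dm > 0 applies the usual
    half-bin correction when magnitudes are rounded to bins of width dm."""
    m = np.asarray(magnitudes)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

rng = np.random.default_rng(1)
mags = 4.0 + rng.exponential(scale=1.0 / np.log(10.0), size=500)   # b = 1 catalogue above M 4
print(round(aki_b_value(mags, mc=4.0), 2))                          # should be close to 1.0
```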
Journal Article
The impact of bias due to exponentiation in the estimation of hazard, risk, and odds ratios: an empirical investigation from 1,495,059 effect sizes from MEDLINE/PubMed abstracts
2025
Background
Parameter estimation using regression methods plays a vital role in medical research. Often a non-linear transformation of a regression parameter is preferred for its more intuitive interpretation. Important examples in medical research are odds ratios, risk ratios, and hazard ratios, which are obtained by exponentiating the estimated regression coefficients of the logit link binomial generalized linear model, log link Poisson generalized linear model or Cox proportional hazards model, respectively. A lot of attention has been devoted to studying and removing the bias of the estimators on the scale of the regression, but the bias of the transformed parameters is rarely addressed.
Methods
Two approaches for reducing the bias due to the exponentiation are reviewed and applied to odds ratios, risk ratios, and hazard ratios reported in the abstracts published in the MEDLINE subset of English-language PubMed records.
Results
We show that correcting for the bias due to the exponentiation may yield substantially different estimates, potentially resulting in a large shrinkage of the reported effect size estimates.
Conclusion
Given the wide availability of methods to reduce the bias on the scale of regression, we encourage their routine use to improve estimation. In situations where the consequences of biased estimation are larger at the exponentiated scale than at the scale of regression, as for example in some policy and planning settings, we additionally encourage the removal of the bias due to the exponentiation.
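One elementary correction of the kind reviewed above, stated here as a general fact rather than as the authors' exact estimator: if the estimated log ratio is approximately normal with standard error se, the exponentiated estimate is inflated by a factor of roughly exp(se^2/2), so multiplying the reported ratio by exp(-se^2/2) removes the first-order exponentiation bias. The sketch below recovers se from a reported 95% confidence interval; the numbers are made up.

```python
# Sketch: first-order correction of the exponentiation bias in a reported
# hazard/odds/risk ratio, assuming the log estimate is approximately normal.
import numpy as np

def debias_ratio(point, lcl, ucl, z=1.959964):
    """Shrink an exponentiated estimate by exp(-se^2/2), with se recovered
    from the reported 95% confidence interval (lcl, ucl)."""
    se = (np.log(ucl) - np.log(lcl)) / (2.0 * z)
    return point * np.exp(-0.5 * se ** 2)

print(round(debias_ratio(2.50, 1.10, 5.70), 3))   # wide CI  -> noticeable shrinkage
print(round(debias_ratio(1.20, 1.10, 1.31), 3))   # narrow CI -> nearly unchanged
```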
Journal Article
A Generalized Continuous Bernoulli Distribution: Statistical Properties, Methods of Estimation, and Applications
by Ehiwario, Jacob C.; Opone, Festus C.; Igabari, John N.
in Asymptotic properties, Datasets, Entropy
2025
This paper introduces a generalized continuous Bernoulli distribution based on the Marshall–Olkin technique for generating new distributions. We refer to the proposed distribution as the Marshall–Olkin continuous Bernoulli (MOCB) distribution. Useful statistical properties of the MOCB distribution are derived, together with mathematical expressions for seven methods of estimating its unknown parameters: the maximum likelihood, ordinary least squares, weighted least squares, maximum product spacing, percentile, Anderson–Darling, and Cramér–von Mises estimators. A study of the asymptotic behavior of the parameter estimates under these methods reveals that the maximum product spacing method provides better estimates of the MOCB parameters than the other methods. Furthermore, the applicability of the MOCB distribution in practical data fitting is illustrated using two real-life datasets. Results obtained from fitting the two datasets suggest that the proposed generalized continuous Bernoulli distribution based on the Marshall–Olkin scheme offers a better fit than those based on the power and transmuted transformations. In particular, the likelihood ratio test (LRT) results for the two datasets indicate a significant improvement of the continuous Bernoulli distribution via the Marshall–Olkin scheme.
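To make the construction concrete, here is a minimal sketch of the Marshall–Olkin transform applied to the continuous Bernoulli CDF, together with one of the seven estimation methods (maximum likelihood) via SciPy. The generic Marshall–Olkin form G(x) = F(x) / (1 - (1 - a)(1 - F(x))) is used and the data are simulated, so this illustrates the idea only and may not match the paper's exact MOCB parameterization.

```python
# Sketch: ML estimation for a Marshall-Olkin transform of the continuous Bernoulli
# distribution. Generic Marshall-Olkin form; sample data are hypothetical.
import numpy as np
from scipy.optimize import minimize

def cb_pdf_cdf(x, lam):
    """Continuous Bernoulli density and CDF on (0, 1), for lam != 0.5."""
    c = 2.0 * np.arctanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam)    # normalizing constant
    pdf = c * lam ** x * (1.0 - lam) ** (1.0 - x)
    cdf = (lam ** x * (1.0 - lam) ** (1.0 - x) + lam - 1.0) / (2.0 * lam - 1.0)
    return pdf, cdf

def neg_loglik(theta, x):
    lam, a = theta
    f, F = cb_pdf_cdf(x, lam)
    g = a * f / (1.0 - (1.0 - a) * (1.0 - F)) ** 2               # Marshall-Olkin density
    return -np.sum(np.log(g))

rng = np.random.default_rng(2)
x = rng.beta(2.0, 1.2, size=300)                                 # hypothetical data on (0, 1)
res = minimize(neg_loglik, x0=np.array([0.7, 1.0]), args=(x,),
               bounds=[(0.51, 0.99), (1e-3, 50.0)],              # lower lam bound avoids the lam = 0.5 singularity
               method="L-BFGS-B")
print("MLE (lambda, alpha):", res.x.round(3))
```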
Journal Article
Sensitivity of flood loss estimates to building representation and flow depth attribution methods in micro-scale flood modelling
by Bermúdez, María; Zischg, Andreas Paul
in Buildings, Computer applications, Computer simulation
2018
Thanks to modelling advances and the increase in computational resources in recent years, it is now feasible to perform 2-D urban flood simulations at very high spatial resolutions and to conduct flood risk assessments at the scale of single buildings. In this study, we explore the sensitivity of flood loss estimates obtained in such micro-scale analyses to the spatial representation of the buildings in the 2D flood inundation model and to the hazard attribution methods in the flood loss model. The results show that building representation has a limited effect on the exposure values (i.e. the number of elements at risk), but can have a significant impact on the hazard values attributed to the buildings. On the other hand, the two methods for hazard attribution tested in this work result in remarkably different flood loss estimates. The sensitivity of the predicted flood losses to the attribution method is comparable to the one associated with the vulnerability curve. The findings highlight the need for incorporating these sources of uncertainty into micro-scale flood risk prediction methodologies.
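A toy illustration of why the attribution rule matters: two simple choices, taking the maximum or the mean water depth over the cells adjacent to a building, are applied to the same depth grid and passed through a depth-damage curve. The grid, footprint, curve, and building value are all made up, and the specific attribution methods compared in the paper may differ.

```python
# Toy sketch: two hazard-attribution rules for one building footprint and their
# effect on the estimated loss. All numbers and the damage curve are hypothetical.
import numpy as np

depth = np.array([[0.0, 0.2, 0.6],
                  [0.1, 0.4, 0.9],
                  [0.0, 0.3, 0.5]])              # simulated water depths (m) around a building
footprint = np.array([[0, 1, 1],
                      [0, 1, 1],
                      [0, 0, 0]], dtype=bool)    # cells touching the building

def damage_fraction(d):
    """Hypothetical depth-damage (vulnerability) curve: 0 at 0 m, saturating at 2 m."""
    return np.clip(d / 2.0, 0.0, 1.0)

building_value = 250_000.0
for rule, d in [("max adjacent depth", depth[footprint].max()),
                ("mean adjacent depth", depth[footprint].mean())]:
    loss = building_value * damage_fraction(d)
    print(f"{rule}: depth = {d:.2f} m, loss = {loss:,.0f}")
```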
Journal Article
Regional landslide parameter constraint and uncertainty quantification on the Qinghai-Tibet Plateau using a geographically-enabled bayesian framework and paleo-landslide data
by Yang, Xiaopeng; Feng, Xuguang; Lu, Jingyi
in Adaptive algorithms, Bayesian analysis, Bayesian theory
2025
The Qinghai-Tibet Plateau (QTP) is highly susceptible to large-scale landslides. Reliable regional hazard assessment is challenged by data scarcity and parameter uncertainty. This study addresses these issues by developing a 'Geographically-Enabled Bayesian Framework' and applying it to a unique dataset of 42 paleo-landslides on the QTP. The framework synergistically integrates geomatics (SRTM DEM-based topography reconstruction), physics-based simulation (Massflow), and advanced statistical modeling. Using Gaussian Process Regression surrogates and an adaptive MCMC algorithm, the framework efficiently constrains two key mobility parameters: the basal friction coefficient (μ) and the pore water pressure ratio (r_u). The framework's performance is demonstrated by a marked reduction in parameter uncertainty (average standard deviation decrease: 24.5% for μ and 36.7% for r_u), providing robust, data-driven constraints. Its robustness is further substantiated through a preliminary cross-regional validation, confirming generalizability. This study also discusses how geospatial data uncertainty influences results and how constrained parameters can inform risk mitigation. Ultimately, this work delivers a validated, scalable methodology and crucial parameter estimates, establishing a more rigorous foundation for landslide hazard and risk assessment on the QTP.
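A minimal sketch of the surrogate-plus-MCMC idea described above: a Gaussian process regressor (scikit-learn) emulates an expensive runout simulator, and a random-walk Metropolis sampler constrains (mu, r_u) against one observed runout distance. The toy 'simulator', prior ranges, observation, and noise level are hypothetical; the study itself couples Massflow simulations with an adaptive MCMC scheme rather than this simple sampler.

```python
# Sketch: Gaussian-process surrogate of a runout simulator plus random-walk
# Metropolis to constrain (mu, r_u). The toy simulator and data are made up.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def toy_runout(mu, ru):
    """Stand-in for the physics simulator: runout (km) falls with friction,
    rises with pore-water pressure."""
    return 10.0 * (1.0 - mu) * (1.0 + 2.0 * ru)

# 1) Train the surrogate on a small design of "expensive" simulator runs
design = rng.uniform([0.05, 0.0], [0.6, 0.6], size=(40, 2))        # (mu, r_u) samples
runs = np.array([toy_runout(m, r) for m, r in design])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                              alpha=1e-6, normalize_y=True).fit(design, runs)

# 2) Random-walk Metropolis on the surrogate against an observed runout
obs, sigma = 9.0, 0.5                                              # hypothetical paleo-landslide runout (km)

def log_post(theta):
    mu, ru = theta
    if not (0.05 <= mu <= 0.6 and 0.0 <= ru <= 0.6):
        return -np.inf                                             # uniform prior box
    return -0.5 * ((gp.predict(theta.reshape(1, -1))[0] - obs) / sigma) ** 2

theta = np.array([0.3, 0.3])
current_lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.03, size=2)
    prop_lp = log_post(prop)
    if np.log(rng.uniform()) < prop_lp - current_lp:
        theta, current_lp = prop, prop_lp
    samples.append(theta)

post = np.array(samples[1000:])
print("posterior mean (mu, r_u):", post.mean(axis=0).round(3))
print("posterior std  (mu, r_u):", post.std(axis=0).round(3))
```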
Journal Article
On the targets of inference with multivariate failure time data
2022
There are several different topics that can be addressed with multivariate failure time regression data, and data analysis methods are needed that are suited to each such topic. Specifically, marginal hazard rate models are well suited to the analysis of exposures or treatments in relation to individual failure time outcomes, when failure time dependencies are themselves of little or no interest. Semiparametric copula models, on the other hand, are well suited to analyses where interest focuses primarily on the magnitude of dependencies between failure times. These models overlap with frailty models, which seem best suited to exploring the details of failure time clustering. Recently proposed multivariate marginal hazard methods, in turn, are well suited to the exploration of exposures or treatments in relation to single, pairwise, and higher dimensional hazard rates. Here these methods will be briefly described, and the final method will be illustrated using the Women’s Health Initiative hormone therapy trial data.
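For the first of these topics (marginal hazard rate models when within-cluster dependence is a nuisance), a common implementation is a working-independence Cox partial likelihood with cluster-robust variance. The sketch below uses lifelines on simulated gamma-frailty data; the data, effect size, and column names are hypothetical and this is not the WHI analysis itself.

```python
# Sketch: marginal (working-independence) Cox model with cluster-robust standard
# errors for clustered failure times. Simulated gamma-frailty data, made-up names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
clusters, per_cluster = 150, 4
frailty = rng.gamma(2.0, 0.5, clusters).repeat(per_cluster)     # shared within-cluster frailty
treatment = rng.integers(0, 2, clusters * per_cluster)
t_event = rng.exponential(1.0 / (frailty * np.exp(-0.4 * treatment)))
t_cens = rng.uniform(0.5, 3.0, clusters * per_cluster)

df = pd.DataFrame({
    "time": np.minimum(t_event, t_cens),
    "event": (t_event <= t_cens).astype(int),
    "treatment": treatment,
    "cluster_id": np.arange(clusters).repeat(per_cluster),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event",
        cluster_col="cluster_id")                               # sandwich variance grouped by cluster
print(cph.summary.loc["treatment", ["coef", "se(coef)"]])       # marginal log hazard ratio
```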
Journal Article
Land subsidence prediction in coal mining using machine learning models and optimization techniques
by Noorian-Bidgoli, Majid; Jahanmiri, Shirin
in Algorithms, Aquatic Pollution, Atmospheric Protection/Air Quality Control/Air Pollution
2024
Land surface subsidence is an environmental hazard resulting from the extraction of underground resources. In underground mining, when mineral materials are extracted deep within the ground, the emptying or caving of the mined spaces leads to vertical displacement of the ground, known as subsidence. This subsidence can extend to the surface as trough subsidence, as the movement and deformation of the hanging-wall rocks of the mining stope propagate upwards. Accurately predicting subsidence is crucial for estimating damage and protecting surface buildings and structures in mining areas. Therefore, developing a model that considers all relevant parameters for subsidence estimation is essential. In this article, we discuss the prediction of land subsidence caused by the caving of a stope’s roof, focusing on coal mining using the longwall method. The main aim of this research is to improve the accuracy of prediction models of land subsidence due to mining. For this purpose, we consider a total of 11 parameters related to coal mining, including mining thickness and depth (related to the deposit), as well as density, cohesion, internal friction angle, elasticity modulus, bulk modulus, shear modulus, Poisson’s ratio, uniaxial compressive strength, and tensile strength (related to the overburden). We utilize information collected from 14 coal mines regarding mining and subsidence to achieve this. We then explore the prediction of subsidence caused by mining using the gene expression programming (GEP) algorithm, optimized through a combination of the artificial bee colony (ABC) and ant lion optimizer (ALO) algorithms. Modeling results demonstrate that combining the GEP algorithm with optimization based on the ABC algorithm yields the best subsidence prediction, achieving a correlation coefficient of 0.96. Furthermore, sensitivity analysis reveals that mining depth and density have the greatest and least effects, respectively, on land surface subsidence resulting from coal mining using the longwall method.
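A toy one-at-a-time sensitivity check of the kind mentioned in the closing sentence: each input of a made-up surrogate subsidence function is perturbed by 10% and the inputs are ranked by the change in predicted subsidence. The surrogate, baseline values, and perturbation size are hypothetical; the study itself ranks eleven mining and overburden parameters using its GEP model tuned with ABC/ALO.

```python
# Toy one-at-a-time sensitivity sketch over a made-up surrogate subsidence model.
# The function, baseline values, and +10% perturbation are illustrative only.
import numpy as np

def surrogate_subsidence(depth_m, thickness_m, density_t_m3):
    """Made-up stand-in for a trained subsidence prediction model (metres)."""
    return 0.8 * thickness_m * np.exp(-depth_m / 150.0) / density_t_m3

base = {"depth_m": 250.0, "thickness_m": 3.0, "density_t_m3": 2.5}
base_pred = surrogate_subsidence(**base)

for name in base:
    perturbed = dict(base, **{name: base[name] * 1.10})         # +10% perturbation
    delta = surrogate_subsidence(**perturbed) - base_pred
    print(f"{name:>14}: change in predicted subsidence = {delta:+.4f} m")
```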
Journal Article