999 result(s) for "kernel smoothing"
Mapping Seismic Hazard for Canadian Sites Using Spatially Smoothed Seismicity Model
The estimated seismic hazard based on the delineated seismic source model is used as the basis to assign the seismic design loads in Canadian structural design codes. An alternative estimation is based on a spatially smoothed source model. However, a quantification of the differences in the Canadian seismic hazard maps (CanSHMs) obtained from the delineated seismic source model and the spatially smoothed model is unavailable. Such quantification is valuable for identifying epistemic uncertainty in the estimated seismic hazard and the degree of uncertainty in the CanSHMs. In the present study, we developed seismic source models using spatial smoothing and a historical earthquake catalogue. We quantified the differences in the estimated Canadian seismic hazard by considering the delineated source model and spatially smoothed source models. For the development of the spatially smoothed seismic source models, we considered spatial kernel smoothing techniques with or without adaptive bandwidth. The results indicate that the use of the delineated seismic source model could lead to under- or over-estimation of the seismic hazard compared with estimates based on spatially smoothed seismic source models. This suggests that the epistemic uncertainty introduced by the choice of seismic source model should be considered when mapping seismic hazard.
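The core idea of a spatially smoothed seismicity model can be illustrated with a minimal sketch: each past epicentre contributes a 2-D Gaussian kernel to the estimated rate at a site. This is not the authors' model (which also considers adaptive bandwidths); the coordinates, bandwidth, and catalogue below are hypothetical.

```python
import math

def smoothed_rate(events, x, y, bandwidth=50.0):
    """Estimate a smoothed seismicity rate at (x, y) by summing
    2-D Gaussian kernels centred on past event locations (km)."""
    norm = 1.0 / (2.0 * math.pi * bandwidth ** 2)
    total = 0.0
    for (ex, ey) in events:
        d2 = (x - ex) ** 2 + (y - ey) ** 2
        total += norm * math.exp(-d2 / (2.0 * bandwidth ** 2))
    return total

# Hypothetical catalogue of epicentres (km coordinates)
catalogue = [(0.0, 0.0), (10.0, 5.0), (200.0, 180.0)]
# The smoothed rate is higher near the event cluster than far from it
near = smoothed_rate(catalogue, 5.0, 2.0)
far = smoothed_rate(catalogue, 100.0, 100.0)
```

An adaptive-bandwidth variant would let `bandwidth` vary per event, e.g. with local event density.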
Non-parametric methods for doubly robust estimation of continuous treatment effects
Continuous treatments (e.g. doses) arise often in practice, but many available causal effect estimators are limited by either requiring parametric models for the effect curve, or by not allowing doubly robust covariate adjustment. We develop a novel kernel smoothing approach that requires only mild smoothness assumptions on the effect curve and still allows for misspecification of either the treatment density or outcome regression. We derive asymptotic properties and give a procedure for data-driven bandwidth selection. The methods are illustrated via simulation and in a study of the effect of nurse staffing on hospital readmissions penalties.
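As background for the kernel smoothing step, a minimal Nadaraya-Watson sketch of estimating an effect curve over a continuous dose is shown below. It is a plain locally weighted average, not the doubly robust estimator the paper develops; the toy data are hypothetical.

```python
import math

def kernel_regression(doses, outcomes, a, bandwidth=1.0):
    """Nadaraya-Watson kernel estimate of E[Y | dose = a]:
    a locally weighted average with Gaussian weights."""
    weights = [math.exp(-0.5 * ((d - a) / bandwidth) ** 2) for d in doses]
    wsum = sum(weights)
    if wsum == 0.0:
        raise ValueError("no support near a; widen the bandwidth")
    return sum(w * y for w, y in zip(weights, outcomes)) / wsum

# Toy data: outcome rises roughly linearly with dose
doses = [0.0, 1.0, 2.0, 3.0, 4.0]
outcomes = [0.1, 1.2, 1.9, 3.1, 4.0]
estimate = kernel_regression(doses, outcomes, 2.0, bandwidth=0.8)
```

The bandwidth governs the bias-variance trade-off, which is why the paper's data-driven bandwidth selection matters.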
De novo identification of differentially methylated regions in the human genome
Background The identification and characterisation of differentially methylated regions (DMRs) between phenotypes in the human genome is of prime interest in epigenetics. We present a novel method, DMRcate, that fits replicated methylation measurements from the Illumina HM450K BeadChip (or 450K array) spatially across the genome using a Gaussian kernel. DMRcate identifies and ranks the most differentially methylated regions across the genome based on tunable kernel smoothing of the differential methylation (DM) signal. The method is agnostic to both genomic annotation and local change in the direction of the DM signal, removes the bias incurred from irregularly spaced methylation sites, and assigns significance to each DMR called via comparison to a null model. Results We show that, for both simulated and real data, the predictive performance of DMRcate is superior to those of Bumphunter and Probe Lasso, and commensurate with that of comb-p. For the real data, we validate all array-derived DMRs from the candidate methods on a suite of DMRs derived from whole-genome bisulfite sequencing called from the same DNA samples, using two separate phenotype comparisons. Conclusions The agglomeration of genomically localised individual methylation sites into discrete DMRs is currently best served by a combination of DM-signal smoothing and subsequent threshold specification. The findings also suggest the design of the 450K array shows preference for CpG sites that are more likely to be differentially methylated, but its overall coverage does not adequately reflect the depth and complexity of methylation signatures afforded by sequencing. For the convenience of the research community we have created a user-friendly R software package called DMRcate, downloadable from Bioconductor and compatible with existing preprocessing packages, which allows others to apply the same DMR-finding method on 450K array data.
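The smoothing idea behind this approach can be sketched in a few lines: weight each CpG site's DM statistic by a Gaussian kernel over base-pair distance, so irregular site spacing is handled naturally. This is an illustrative simplification, not the DMRcate implementation; the positions and statistics are hypothetical.

```python
import math

def smooth_dm_signal(positions, dm_stats, sigma=500.0):
    """Gaussian-kernel smooth a per-CpG differential-methylation
    statistic along genomic coordinates; weights depend on actual
    base-pair distance, so irregular site spacing is handled."""
    smoothed = []
    for p in positions:
        ws = [math.exp(-0.5 * ((q - p) / sigma) ** 2) for q in positions]
        smoothed.append(sum(w * s for w, s in zip(ws, dm_stats)) / sum(ws))
    return smoothed

# Hypothetical CpG sites: a cluster of elevated DM stats near 10 kb,
# plus one distant, unaffected site
positions = [9500, 9800, 10000, 10200, 50000]
dm_stats = [2.0, 2.5, 3.0, 2.2, 0.1]
smoothed = smooth_dm_signal(positions, dm_stats)
```

Sites far from the cluster receive negligible weight, so the smoothed signal stays low there while the clustered region stands out.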
A GENERAL APPROACH FOR CURE MODELS IN SURVIVAL ANALYSIS
In survival analysis it often happens that some subjects under study do not experience the event of interest; they are considered to be “cured.” The population is thus a mixture of two subpopulations, one of cured subjects and one of “susceptible” subjects. We propose a novel approach to estimate a mixture cure model when covariates are present and the lifetime is subject to random right censoring. We work with a parametric model for the cure proportion, while the conditional survival function of the uncured subjects is unspecified. The approach is based on an inversion which allows us to write the survival function as a function of the distribution of the observable variables. This leads to a very general class of models which allows a flexible and rich modeling of the conditional survival function. We show the identifiability of the proposed model as well as the consistency and the asymptotic normality of the model parameters. We also consider in more detail the case where kernel estimators are used for the nonparametric part of the model. The new estimators are compared with the estimators from a Cox mixture cure model via simulations. Finally, we apply the new model on a medical data set.
OPTIMAL MODEL AVERAGING OF VARYING COEFFICIENT MODELS
We consider the problem of model averaging over a set of semiparametric varying coefficient models where the varying coefficients can be functions of continuous and categorical variables. We propose a Mallows model averaging procedure that is capable of delivering model averaging estimators with solid finite-sample performance. Theoretical underpinnings are provided, finite-sample performance is assessed via Monte Carlo simulation, and an illustrative application is presented. The approach is very simple to implement in practice and R code is provided as supplementary material.
Using Satellite Telemetry and Aerial Counts to Estimate Space Use by Grey Seals around the British Isles
1. In the UK, resolving conflicts between the conservation of grey seals, the management of fish stocks and marine exploitation requires knowledge of the seals' use of space. We present a map of grey seal usage around the British Isles based on satellite telemetry data from adult animals and haul-out survey data. 2. Our approach combined modelling and interpolation. To model the seals' association with particular coastal sites (the haul-outs), we divided the population into sub-populations associated with 24 haul-out groups. Haul-out-specific maps of accessibility were used to supervise usage estimation from satellite telemetry. The mean and variance of seal numbers at each haul-out group were obtained from haul-out counts. The aggregate map of usage for the entire population was produced by adding together the haul-out-specific usage maps, weighted by the mean number of animals using each haul-out. 3. Seal usage was primarily concentrated (i) off the northern coasts of the British Isles, (ii) closer to the coast than might be expected purely on the basis of accessibility from the haul-outs and (iii) in a limited number of marine hot-spots. 4. Although our results currently represent the best estimate of how grey seals use the marine environment around Britain, they are neither definitive nor equally precise for all haul-outs. Further data collection should focus on the south-west of the British Isles, and aerial counts should be repeated for all haul-outs. 5. Synthesis and applications. This work provides environmental managers with current estimates of grey seal usage and describes a methodology for maximizing data efficiency. Our results could guide government departments in licensing marine exploitation by the oil industry, in estimating grey seal predation pressure on vulnerable or economically important prey and in delineating marine special areas of conservation (SAC). Our finding that grey seal usage is characterized by a limited number of hot-spots means that the species is particularly suited to localized conservation efforts.
A SIMPLE FOURIER ANALYTIC PROOF OF THE AKT OPTIMAL MATCHING THEOREM
We present a short and elementary proof of the Ajtai–Komlós–Tusnády (AKT) optimal matching theorem in dimension 2 via Fourier analysis and a smoothing argument. The upper bound applies to more general families of samples, including dependent variables, of interest in the study of rates of convergence for empirical measures. Following the recent PDE approach by L. Ambrosio, F. Stra and D. Trevisan, we also adapt a simple proof of the lower bound.
multiScaleR: a generalizable approach for multiscale ecological modeling and scale of effect estimation
Context Analyses in landscape ecology often seek to understand how the landscape surrounding field survey locations relates to ecological responses measured at those sites. A central challenge in these studies is defining the spatial scale at which landscape variables matter. While the limitations of standard approaches to estimating this scale are well known, practical alternatives remain limited and often difficult to apply. Objectives Using simulation and the newly developed R package `multiScaleR`, this paper describes the performance of scale optimization in relation to data type, sample size, effect size, sample independence, raster surface correlation, spatial autocorrelation, and habitat aggregation. I demonstrate how `multiScaleR` is a significant and accessible advancement for estimating scales of effect. Methods and results The package builds upon existing methods that apply kernel weighting functions to landscape variables but is more general and versatile than existing methods. Functions have been optimized for computational speed and efficiency, including parallelization, use of sparse matrices, and C++, facilitating efficient analyses of large data sets. Maximum likelihood-based regression frameworks commonly used in landscape ecology, including models from `unmarked`, `spaMM`, and `glmmTMB`, can be seamlessly integrated with `multiScaleR`. The package provides a complete workflow for fitting models, conducting model selection, and spatially projecting models. Two critical insights emerge from simulations and analyses with `multiScaleR`: (1) scales of effect can be estimated with high accuracy and precision alongside regression parameters, but (2) achieving reliable estimates requires large sample sizes. Conclusions `multiScaleR` is a purpose-built R package to estimate scales of effect of landscape variables in regression analyses. The accessibility and flexibility of this package make it a powerful new resource in the toolbox of spatial ecologists.
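The kernel weighting of landscape variables described above can be sketched minimally: average a raster variable around a survey site with Gaussian weights, where the kernel width stands in for the scale of effect. This is an illustrative analogue, not the `multiScaleR` implementation; the cell coordinates and cover values are hypothetical.

```python
import math

def kernel_weighted_covariate(cells, site, sigma=100.0):
    """Gaussian-kernel-weighted average of a landscape variable
    around a survey site; sigma plays the role of the scale of effect."""
    wsum, acc = 0.0, 0.0
    for (x, y, value) in cells:
        d2 = (x - site[0]) ** 2 + (y - site[1]) ** 2
        w = math.exp(-0.5 * d2 / sigma ** 2)
        wsum += w
        acc += w * value
    return acc / wsum

# Hypothetical raster cells (x, y, forest cover) around a site at the origin
cells = [(0, 0, 0.9), (50, 0, 0.8), (500, 0, 0.1)]
local = kernel_weighted_covariate(cells, (0, 0), sigma=100.0)
broad = kernel_weighted_covariate(cells, (0, 0), sigma=500.0)
```

Widening sigma pulls in the distant low-cover cell, which is exactly why estimating the scale of effect alongside the regression parameters matters.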
SMOOTH BACKFITTING FOR ERRORS-IN-VARIABLES ADDITIVE MODELS
In this work, we develop a new smooth backfitting method and theory for estimating additive nonparametric regression models when the covariates are contaminated by measurement errors. For this, we devise a new kernel function that suitably deconvolutes the bias due to measurement errors as well as renders a projection interpretation to the resulting estimator in the space of additive functions. The deconvolution property and the projection interpretation are essential for a successful solution of the problem. We prove that the method based on the new kernel weighting scheme achieves the optimal rate of convergence in one-dimensional deconvolution problems when the smoothness of measurement error distribution is less than a threshold value. We find that the speed of convergence is slower than the univariate rate when the smoothness of measurement error distribution is above the threshold, but it is still much faster than the optimal rate in multivariate deconvolution problems. The theory requires a deliberate analysis of the nonnegligible effects of measurement errors being propagated to other additive components through the backfitting operation. We present the finite sample performance of the deconvolution smooth backfitting estimators that confirms our theoretical findings.
Time-aware tensor decomposition for sparse tensors
Given a sparse time-evolving tensor, how can we effectively factorize it to accurately discover latent patterns? Tensor decomposition has been extensively utilized for analyzing various multi-dimensional real-world data. However, existing tensor decomposition models have disregarded the temporal property for tensor decomposition while most real-world data are closely related to time. Moreover, they do not address accuracy degradation due to the sparsity of time slices. The essential problems of how to exploit the temporal property for tensor decomposition and consider the sparsity of time slices remain unresolved. In this paper, we propose time-aware tensor decomposition (TATD), an accurate tensor decomposition method for sparse temporal tensors. TATD is designed to exploit time dependency and time-varying sparsity of real-world temporal tensors. We propose a new smoothing regularization with a Gaussian kernel for modeling time dependency. Moreover, we improve the performance of TATD by considering time-varying sparsity. We design an alternating optimization scheme suitable for temporal tensor decomposition with our smoothing regularization. Extensive experiments show that TATD provides the state-of-the-art accuracy for decomposing temporal tensors.
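The Gaussian-kernel time smoothing underlying this kind of regularization can be sketched on a temporal factor matrix: each time slice's factor row is pulled toward a Gaussian-weighted average of its neighbours. This is an illustrative sketch of the smoothing idea only, not the paper's regularized optimization; the factor values and window are hypothetical.

```python
import math

def smooth_time_factor(time_factor, window=2, sigma=1.0):
    """Replace each row of a temporal factor matrix with a
    Gaussian-weighted average of neighbouring time slices,
    the idea behind a kernel smoothing regularizer on the time mode."""
    T = len(time_factor)
    rank = len(time_factor[0])
    out = []
    for t in range(T):
        wsum, acc = 0.0, [0.0] * rank
        for s in range(max(0, t - window), min(T, t + window + 1)):
            w = math.exp(-0.5 * ((s - t) / sigma) ** 2)
            wsum += w
            for r in range(rank):
                acc[r] += w * time_factor[s][r]
        out.append([a / wsum for a in acc])
    return out

# Hypothetical rank-2 temporal factor with a noisy spike at t = 2
factor = [[1.0, 0.0], [1.0, 0.0], [5.0, 1.0], [1.0, 0.0], [1.0, 0.0]]
smoothed = smooth_time_factor(factor)
```

In the regularized setting, this smoothed value would enter the loss as a penalty on deviations, shrinking spikes caused by sparse, noisy time slices.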