Search Results

37 results for "ABC intervals"
Estimation and Accuracy After Model Selection
Classical statistical theory ignores model selection in assessing estimation accuracy. Here we consider bootstrap methods for computing standard errors and confidence intervals that take model selection into account. The methodology involves bagging, also known as bootstrap smoothing, to tame the erratic discontinuities of selection-based estimators. A useful new formula for the accuracy of bagging then provides standard errors for the smoothed estimators. Two examples, nonparametric and parametric, are carried through in detail: a regression model where the choice of degree (linear, quadratic, cubic, …) is determined by the Cₚ criterion, and a Lasso-based estimation problem.
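The bagging step described above is easy to sketch: an estimator that first picks a polynomial degree by Mallows' Cₚ and then predicts is discontinuous in the data, and averaging it over bootstrap resamples smooths it out. The toy data and all names below are illustrative, and the paper's new accuracy formula for the smoothed estimator is not reproduced; the bootstrap standard deviation shown is only the plain resampling spread.

```python
import numpy as np

rng = np.random.default_rng(0)

def cp_select_and_predict(x, y, x0, max_deg=3):
    """Pick a polynomial degree by Mallows' Cp, then predict at x0.

    Selection makes this estimator discontinuous in the data, which is
    what bagging is meant to smooth.
    """
    n = len(y)
    # Estimate sigma^2 from the largest model considered.
    resid_full = y - np.polyval(np.polyfit(x, y, max_deg), x)
    sigma2 = resid_full @ resid_full / (n - max_deg - 1)
    best_deg, best_cp = None, np.inf
    for d in range(1, max_deg + 1):
        fit = np.polyfit(x, y, d)
        rss = np.sum((y - np.polyval(fit, x)) ** 2)
        cp = rss / sigma2 - n + 2 * (d + 1)      # Mallows' Cp
        if cp < best_cp:
            best_deg, best_cp = d, cp
    return np.polyval(np.polyfit(x, y, best_deg), x0)

# Toy data (illustrative only).
x = np.linspace(0, 1, 40)
y = 1 + 2 * x - 1.5 * x**2 + rng.normal(0, 0.3, x.size)
x0 = 0.8

# Bagging / bootstrap smoothing: average the selection-based estimator
# over B bootstrap resamples of the (x, y) pairs.
B = 2000
reps = np.empty(B)
for b in range(B):
    idx = rng.integers(0, len(y), len(y))
    reps[b] = cp_select_and_predict(x[idx], y[idx], x0)

smoothed = reps.mean()        # the bagged (smoothed) estimate
spread = reps.std(ddof=1)     # plain bootstrap spread, for reference
print(f"smoothed estimate at x0={x0}: {smoothed:.3f} (bootstrap sd {spread:.3f})")
```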
An empirical adjustment to the likelihood ratio statistic
Consider a model parameterised by a scalar parameter of interest ψ and a nuisance parameter λ. Inference about ψ may be based on the signed square root of the likelihood ratio statistic, R. The statistic R is asymptotically distributed according to a standard normal distribution, with error of order $O(n^{-1/2})$. To reduce the error of this normal approximation, several modifications to R have been proposed, such as Barndorff-Nielsen's modified directed likelihood statistic, R*. In this paper, an approximation to R* is proposed that can be calculated numerically for a wide range of models. This approximation is shown to agree with R* with error of order $O_p(n^{-1})$. The results are illustrated with several examples.
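For concreteness, here is a minimal sketch of the directed statistic R itself, for a one-parameter exponential model with no nuisance parameter (the paper's numerical approximation to R* is not reproduced). The model choice and sample are illustrative assumptions.

```python
import numpy as np

def signed_root_lr(y, psi):
    """Directed likelihood R for the mean psi of an exponential sample.

    R = sign(psi_hat - psi) * sqrt(2 * (l(psi_hat) - l(psi))),
    which is approximately N(0, 1) with error O(n^{-1/2}).
    """
    n, psi_hat = len(y), np.mean(y)            # MLE of the mean
    def loglik(m):                             # exponential(mean m) log-likelihood
        return -n * np.log(m) - np.sum(y) / m
    w = 2.0 * (loglik(psi_hat) - loglik(psi))  # likelihood ratio statistic
    return np.sign(psi_hat - psi) * np.sqrt(max(w, 0.0))

rng = np.random.default_rng(1)
y = rng.exponential(scale=2.0, size=30)
print(signed_root_lr(y, psi=2.0))   # roughly standard normal under the truth
```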
Bayes and likelihood calculations from confidence intervals
Recently there has been considerable progress on setting good approximate confidence intervals for a single parameter θ in a multi-parameter family. Here we use these frequentist results as a convenient device for making Bayes, empirical Bayes and likelihood inferences about θ. A simple formula is given that produces an approximate likelihood function $L_x^\dagger(\theta)$ for θ, with all nuisance parameters eliminated, based on any system of approximate confidence intervals. The statistician can then modify $L_x^\dagger(\theta)$ with Bayes or empirical Bayes information for θ, without worrying about nuisance parameters. The method is developed for multiparameter exponential families, where there exists a simple and accurate system of approximate confidence intervals for any smoothly defined parameter. The approximate likelihood $L_x^\dagger(\theta)$ based on this system requires only a few times as much computation as the maximum likelihood estimate $\hat\theta$ and its estimated standard error $\hat\sigma$. The formula for $L_x^\dagger(\theta)$ is justified in terms of high-order adjusted likelihoods and also the Jeffreys-Welch & Peers theory of uninformative priors. Several examples are given.
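The paper's formula for $L_x^\dagger(\theta)$ is not quoted in this abstract, so the sketch below only illustrates the underlying idea under a strong assumption: invert a system of interval endpoints into a confidence distribution $\alpha(\theta)$ and differentiate it to get a likelihood-like curve. With exactly normal intervals this recovers the normal likelihood, which serves as a sanity check; all names and numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Suppose a system of approximate intervals gives, for each level alpha,
# an upper endpoint theta_hat[alpha].  Inverting endpoint -> level yields
# a confidence distribution alpha(theta); its derivative is a
# likelihood-like "confidence density" for theta.

theta_hat, sigma_hat = 1.3, 0.4           # illustrative point estimate and SE

def upper_endpoint(alpha):
    # Standard normal-theory intervals as a stand-in for any interval system.
    return theta_hat + norm.ppf(alpha) * sigma_hat

# Tabulate endpoints on a grid of levels, then differentiate numerically.
alphas = np.linspace(0.001, 0.999, 2001)
endpoints = upper_endpoint(alphas)
conf_density = np.gradient(alphas, endpoints)   # d alpha / d theta

# Sanity check: for normal intervals this equals the N(theta_hat, sigma_hat)
# density, i.e. the exact likelihood with no nuisance parameters.
exact = norm.pdf(endpoints, loc=theta_hat, scale=sigma_hat)
print(np.max(np.abs(conf_density - exact)))     # ~0 up to grid error
```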
A NOTE ON POWERFUL NUMBERS IN SHORT INTERVALS
We investigate uniform upper bounds for the number of powerful numbers in short intervals $(x, x + y]$. We obtain unconditional upper bounds $O(y/\log y)$ and $O(y^{11/12})$ for all powerful numbers and $y^{1/2}$-smooth powerful numbers, respectively. Conditional on the $abc$-conjecture, we prove the bound $O(y/\log^{1+\epsilon} y)$ for squarefull numbers and the bound $O(y^{(2+\epsilon)/k})$ for $k$-full numbers when $k \ge 3$. These bounds are related to Roth's theorem on arithmetic progressions and the conjecture on the nonexistence of three consecutive squarefull numbers.
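As a purely illustrative aside (nothing to do with the proofs), the objects being counted are easy to enumerate: $n$ is powerful (squarefull) when every prime dividing $n$ divides it at least twice, equivalently $n = a^2 b^3$. A brute-force sketch:

```python
def powerful_in_interval(x, y):
    """Powerful numbers n in (x, x+y]: every prime p | n has p^2 | n.

    Uses the representation n = a^2 * b^3; enumerating all (a, b) pairs
    covers every powerful number, and a set removes duplicates.
    """
    lo, hi = x, x + y
    found = set()
    b = 1
    while b ** 3 <= hi:
        a = 1
        while a * a * b ** 3 <= hi:
            n = a * a * b ** 3
            if n > lo:
                found.add(n)
            a += 1
        b += 1
    return sorted(found)

# Counting these against y / log y illustrates the shape of the bounds above.
print(powerful_in_interval(10_000, 500))
```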
Multidimensional separation and analysis of alpha-1-acid glycoprotein N-glycopeptides using high-field asymmetric waveform ion mobility spectrometry (FAIMS) and nano-liquid chromatography tandem mass spectrometry
Bottom-up nLC-MS/MS-based glycoprotein mass spectrometry workflows rely on the generation of a mixture of non-glycosylated and glycosylated peptides via proteolysis of glycoproteins. Such methods are challenged by suppression of hydrophilic glycopeptide ions by more abundant, hydrophobic, and readily ionizable non-glycosylated peptides. Commercially available high-field asymmetric waveform ion mobility spectrometry (FAIMS) devices have recently been introduced and present a potential benefit for glycoproteomic workflows by enabling orthogonal separation of non-glycosylated peptides and glycopeptides following chromatographic separation and prior to MS/MS analysis. However, knowledge is lacking regarding optimal FAIMS conditions for glycopeptide analyses. Here, we document optimal FAIMS compensation voltages for the transmission and analysis of human alpha-1-acid glycoprotein (AGP) tryptic N-glycopeptide ions. Further, we evaluate the effect of FAIMS on AGP glycopeptide assignment confidence by comparing the number of assigned glycopeptides at different confidence levels using a standard nLC-MS/MS method or an otherwise identical method employing FAIMS. Optimized methods will potentiate glycoproteomic analyses by increasing the number of unique glycopeptide identifications and the confidence of glycopeptide assignments. Data are available via ProteomeXchange with identifier PXD036667. Graphical abstract: analysis of AGP tryptic digests via nLC-FAIMS-MS/MS (top) led to the establishment of ideal FAIMS voltages for the analysis of AGP N-glycopeptides (bottom), suggesting that FAIMS can improve the depth of glycoproteome characterization; pairs of CV magnitudes are shown along the x-axis.
High-density lipoprotein sensor based on molecularly imprinted polymer
A decreased blood level of high-density lipoprotein (HDL) is one of the essential criteria in diagnosing metabolic syndrome, which is associated with the development of atherosclerosis and coronary heart disease. Herein, we report the synthesis of a molecularly imprinted polymer (MIP) that selectively binds HDL, namely HDL-MIP, and thus serves as an artificial, biomimetic sensor layer. The optimized polymer contains methacrylic acid and N-vinylpyrrolidone in a 2:3 ratio, cross-linked with ethylene glycol dimethacrylate. On 10 MHz dual-electrode quartz crystal microbalances (QCM), the HDL-MIP exhibits a dynamic detection range toward HDL standards across the clinically relevant range of 2–250 mg/dL HDL cholesterol (HDL-C) in 10 mM phosphate-buffered saline (PBS, pH = 7.4) without significant interference: low-density lipoprotein (LDL) yields 5% of the HDL signal, and both very-low-density lipoprotein (VLDL) and human serum albumin (HSA) yield 0%. The sensor achieves recovery rates between 94% and 104% at the 95% confidence level, with a precision of 2.3–7.7%, and shows appreciable correlation (R² = 0.97) with the enzymatic colorimetric assay that is the standard in clinical tests. In contrast to the latter, it delivers rapid results (10 min) in a one-step analysis without the need for sample preparation.
A New Model for Stock Management in Order to Rationalize Costs: ABC-FUCOM-Interval Rough CoCoSo Model
Cost rationalization has become imperative in every economic system in order to create adequate foundations for its efficient and sustainable management. Competitiveness in the global market is extremely high, and it is challenging to manage business and logistics systems, especially with regard to financial parameters. It is necessary to rationalize costs in all activities and processes. Inventories are an inevitability in every logistics system, and the aim is to create adequate and symmetrical policies for their efficient and sustainable management. To be able to do this, it is necessary to determine which products represent the largest percentage share in the value of procurement and which are the most represented quantitatively. For this purpose, ABC analysis, which classifies products into three categories, is applied taking different constraints into account. The aim of this paper is to form a new model that integrates ABC analysis, the Full Consistency Method (FUCOM), and a novel Interval Rough Combined Compromise Solution (CoCoSo) for stock management in the storage system. A new IRN Dombi weighted geometric averaging (IRNDWGA) operator is developed to aggregate the initial decision matrix. After grouping the products into the three categories A, B and C, appropriate suppliers must be identified for each category in order to rationalize procurement costs. Financial, logistical, and quality parameters are taken into account, and the FUCOM method is used to determine their significance. A new Interval CoCoSo approach is developed to determine the optimal suppliers for each product group. The results obtained are tested through a multi-phase sensitivity analysis.
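The ABC step of the model is the one piece simple enough to sketch here: rank products by procurement value and split them by cumulative share. The 80%/95% thresholds below are the conventional textbook choice, assumed rather than taken from the paper, and the FUCOM and interval rough CoCoSo stages are not reproduced.

```python
def abc_classify(values, a_cut=0.80, b_cut=0.95):
    """Classify products into A/B/C by cumulative share of total value.

    `values` maps product -> annual procurement value.  The thresholds are
    the conventional 80%/95% splits, not figures from the paper.
    """
    total = sum(values.values())
    ranked = sorted(values, key=values.get, reverse=True)
    classes, running = {}, 0.0
    for item in ranked:
        running += values[item] / total
        classes[item] = "A" if running <= a_cut else "B" if running <= b_cut else "C"
    return classes

demo = {"P1": 45000, "P2": 30000, "P3": 15000, "P4": 6000, "P5": 3000, "P6": 1000}
print(abc_classify(demo))   # P1, P2 -> A; P3 -> B; the rest -> C
```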
Pre-Endoscopic Scores Predicting Low-Risk Patients with Upper Gastrointestinal Bleeding: A Systematic Review and Meta-Analysis
Background: Several risk scores have attempted to risk stratify patients with acute upper gastrointestinal bleeding (UGIB) who are at a lower risk of requiring hospital-based interventions or of negative outcomes including death. This systematic review and meta-analysis aimed to compare the predictive abilities of pre-endoscopic scores in prognosticating the absence of adverse events in patients with UGIB. Methods: We searched MEDLINE, EMBASE, Central, and ISI Web of Knowledge from inception to February 2023. All fully published studies assessing a pre-endoscopic score in patients with UGIB were included. The primary outcome was a composite score for the need of a hospital-based intervention (endoscopic therapy, surgery, angiography, or blood transfusion). Secondary outcomes included mortality, rebleeding, and the individual endpoints of the composite outcome. Both proportional and comparative analyses were performed. Results: Thirty-eight studies (n = 36,215 patients) were included from 2153 citations. Few patients with a low Glasgow-Blatchford score (GBS) cutoff (0, ≤1 and ≤2) required hospital-based interventions (0.02 (0.01, 0.05), 0.04 (0.02, 0.09) and 0.03 (0.02, 0.07), respectively). The proportions of patients with clinical Rockall (CRS = 0) and ABC (≤3) scores requiring hospital-based intervention were 0.19 (0.15, 0.24) and 0.69 (0.62, 0.75), respectively. GBS (cutoffs 0, ≤1 and ≤2), CRS (cutoffs 0, ≤1 and ≤2), AIMS65 (cutoffs 0 and ≤1) and ABC (cutoffs ≤1 and ≤3) scores were all associated with few patients (0.01–0.04) dying. The proportion of patients suffering other secondary outcomes varied between scoring systems but, in general, was lowest for the GBS. GBS (using cutoffs 0, ≤1 and ≤2) showed excellent discriminative ability in predicting the need for hospital-based interventions (OR 0.02 (0.00, 0.16), 0.00 (0.00, 0.02) and 0.01 (0.00, 0.01), respectively). A CRS cutoff of 0 was less discriminative. For the other secondary outcomes, discriminative abilities varied between scores but, in general, the GBS (using cutoffs up to 2) was clinically useful for most outcomes. Conclusions: A GBS cutoff of one or less best identified low-risk patients. Expanding the GBS cutoff to 2 maintains prognostic accuracy while allowing more patients to be managed safely as outpatients. The evidence is limited by the number, homogeneity, quality, and generalizability of the available data, and by the subjectivity of judging clinical impact. Additional comparative and, ideally, interventional studies are needed.
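As an illustration of the proportional analyses mentioned above, the sketch below pools study-level proportions by inverse-variance weighting on the logit scale under a fixed-effect assumption. This is a common choice, not necessarily the review's exact model, and the numbers in the example are invented.

```python
import math

def pool_proportions(events, totals):
    """Fixed-effect pooled proportion via inverse-variance on the logit scale.

    events[i]/totals[i] is the proportion in study i (e.g. patients with
    GBS <= 1 needing a hospital-based intervention).  A 0.5 continuity
    correction guards against zero cells.
    """
    logits, weights = [], []
    for e, n in zip(events, totals):
        e, n = e + 0.5, n + 1.0                  # continuity correction
        p = e / n
        var = 1.0 / e + 1.0 / (n - e)            # variance of the logit
        logits.append(math.log(p / (1 - p)))
        weights.append(1.0 / var)
    pooled = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    inv = lambda x: 1.0 / (1.0 + math.exp(-x))   # back-transform to a proportion
    return inv(pooled), (inv(pooled - 1.96 * se), inv(pooled + 1.96 * se))

# Illustrative numbers only, not data from the review.
print(pool_proportions(events=[1, 0, 3], totals=[120, 85, 240]))
```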
Multi-isotope calibration for inductively coupled plasma mass spectrometry
Multi-isotope calibration (MICal) is a novel approach to calibration for inductively coupled plasma mass spectrometry (ICP-MS). In MICal, only two calibration solutions are required: solution A, composed of 50% v/v of sample and 50% v/v of a standard solution containing the analytes, and solution B, composed of 50% v/v of sample and 50% v/v of a blank solution. MICal is based on monitoring the signal intensities of several isotopes of the same analyte in solutions A and B. By plotting the analytical signals from solution A on the x-axis and those from solution B on the y-axis, the analyte concentration in the sample is calculated from the slope of that graph and the concentration of the reference standard added to solution A. As both solutions contain the same amount of sample, matrix matching is easily achieved. In this proof-of-concept study, MICal was applied to the determination of Ba, Cd, Se, Sn, and Zn in seven certified reference materials with different matrices (e.g., plant materials, flours, and water). In most cases, MICal results showed no statistical difference from the certified values at the 95% confidence level. The new strategy was also compared with traditional calibration methods such as external calibration, internal standardization and standard additions, and recoveries were generally better for MICal. This is a simple, accurate, and fast alternative method for matrix-matching calibration in ICP-MS. Graphical abstract: multi-isotope calibration, a fast and innovative matrix-matching calibration for ICP-MS.
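The slope arithmetic implied by this description can be sketched as follows. Under a linear-response reading, isotope $i$ contributes the point $x_i = k_i (C_s + C_{ref})/2$, $y_i = k_i C_s/2$, so the line through the origin has slope $m = C_s/(C_s + C_{ref})$ and hence $C_s = m\,C_{ref}/(1 - m)$. This algebra is our reading of the abstract, not a formula quoted from the paper.

```python
import numpy as np

def mical_concentration(signals_A, signals_B, c_ref):
    """Estimate the sample concentration from multi-isotope signals.

    signals_A: intensities of each isotope in solution A (50% sample +
               50% standard at concentration c_ref).
    signals_B: the same isotopes in solution B (50% sample + 50% blank).
    Fits y = m*x through the origin; under the linear-response reading,
    m = C_s / (C_s + c_ref), so C_s = m * c_ref / (1 - m).
    """
    x = np.asarray(signals_A, float)
    y = np.asarray(signals_B, float)
    m = (x @ y) / (x @ x)          # least-squares slope through the origin
    return m * c_ref / (1.0 - m)

# Synthetic check: 5 isotopes with different sensitivities, C_s = 8, C_ref = 20.
k = np.array([1.0, 0.6, 0.25, 0.1, 0.05])
c_s, c_ref = 8.0, 20.0
A = k * 0.5 * (c_s + c_ref)
B = k * 0.5 * c_s
print(mical_concentration(A, B, c_ref))   # ~8.0
```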
Bootstrap Confidence Intervals
This article surveys bootstrap methods for producing good approximate confidence intervals. The goal is to improve by an order of magnitude upon the accuracy of the standard intervals $\hat{\theta} \pm z^{(\alpha)} \hat{\sigma}$, in a way that allows routine application even to very complicated problems. Both theory and examples are used to show how this is done. The first seven sections provide a heuristic overview of four bootstrap confidence interval procedures: $BC_a$, bootstrap-$t$, ABC and calibration. Sections 8 and 9 describe the theory behind these methods, and their close connection with the likelihood-based confidence interval theory developed by Barndorff-Nielsen, Cox and Reid and others.
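Of the four procedures surveyed, the $BC_a$ interval is compact enough to sketch: the bias correction $z_0$ comes from the bootstrap distribution and the acceleration $a$ from a jackknife. This is a generic textbook rendering (assuming numpy and scipy are available), not code from the article; the ABC and calibration variants are omitted.

```python
import numpy as np
from scipy.stats import norm

def bca_interval(data, stat=np.mean, B=4000, alpha=0.05, seed=0):
    """Bias-corrected and accelerated (BCa) bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, float)
    n, theta = len(data), stat(data)

    # Bootstrap replications of the statistic.
    boots = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(B)])

    # Bias correction z0 from the fraction of replicates below theta-hat.
    z0 = norm.ppf(np.mean(boots < theta))

    # Acceleration a from the jackknife.
    jack = np.array([stat(np.delete(data, i)) for i in range(n)])
    d = jack.mean() - jack
    a = np.sum(d**3) / (6.0 * np.sum(d**2) ** 1.5)

    # Adjusted percentile levels.
    def level(z):
        return norm.cdf(z0 + (z0 + z) / (1.0 - a * (z0 + z)))
    lo, hi = level(norm.ppf(alpha / 2)), level(norm.ppf(1 - alpha / 2))
    return np.quantile(boots, [lo, hi])

rng = np.random.default_rng(42)
x = rng.lognormal(size=50)          # skewed data, where BCa helps most
print(bca_interval(x))
```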