Catalogue Search | MBRL
Explore the vast range of titles available.
9,836 result(s) for "Probabilistic methods"
Imaging features and safety and efficacy of endovascular stroke treatment: a meta-analysis of individual patient-level data
by Ringleb, P; Reiff, T; Hopyan, J
in Aged; Brain Ischemia - diagnostic imaging; Brain Ischemia - pathology
2018
Evidence regarding whether imaging can be used effectively to select patients for endovascular thrombectomy (EVT) is scarce. We aimed to investigate the association between baseline imaging features and safety and efficacy of EVT in acute ischaemic stroke caused by anterior large-vessel occlusion.
In this meta-analysis of individual patient-level data, the HERMES collaboration identified in PubMed seven randomised trials of endovascular stroke treatment that compared EVT with standard medical therapy, published between Jan 1, 2010, and Oct 31, 2017. Only trials that required vessel imaging to identify patients with proximal anterior circulation ischaemic stroke and that used predominantly stent retrievers or second-generation neurothrombectomy devices in the EVT group were included. Risk of bias was assessed with the Cochrane handbook methodology. Central investigators, masked to clinical information other than stroke side, categorised baseline imaging features of ischaemic change with the Alberta Stroke Program Early CT Score (ASPECTS) or according to involvement of more than 33% of middle cerebral artery territory, and by thrombus volume, hyperdensity, and collateral status. The primary endpoint was neurological functional disability scored on the modified Rankin Scale (mRS) score at 90 days after randomisation. Safety outcomes included symptomatic intracranial haemorrhage, parenchymal haematoma type 2 within 5 days of randomisation, and mortality within 90 days. For the primary analysis, we used mixed-effects ordinal logistic regression adjusted for age, sex, National Institutes of Health Stroke Scale score at admission, intravenous alteplase, and time from onset to randomisation, and we used interaction terms to test whether imaging categorisation at baseline modifies the association between treatment and outcome. This meta-analysis was prospectively designed by the HERMES executive committee but has not been registered.
Among 1764 pooled patients, 871 were allocated to the EVT group and 893 to the control group. Risk of bias was low except in the THRACE study, which used unblinded assessment of outcomes 90 days after randomisation and MRI predominantly as the primary baseline imaging tool. The overall treatment effect favoured EVT (adjusted common odds ratio [cOR] for a shift towards better outcome on the mRS 2·00, 95% CI 1·69–2·38; p<0·0001). EVT achieved better outcomes at 90 days than standard medical therapy alone across a broad range of baseline imaging categories. Mortality at 90 days (14·7% vs 17·3%, p=0·15), symptomatic intracranial haemorrhage (3·8% vs 3·5%, p=0·90), and parenchymal haematoma type 2 (5·6% vs 4·8%, p=0·52) did not differ between the EVT and control groups. No treatment effect modification by baseline imaging features was noted for mortality at 90 days and parenchymal haematoma type 2. Among patients with ASPECTS 0–4, symptomatic intracranial haemorrhage was seen in ten (19%) of 52 patients in the EVT group versus three (5%) of 66 patients in the control group (adjusted cOR 3·94, 95% CI 0·94–16·49; p_interaction=0·025), and among patients with more than 33% involvement of middle cerebral artery territory, symptomatic intracranial haemorrhage was observed in 15 (14%) of 108 patients in the EVT group versus four (4%) of 113 patients in the control group (4·17, 1·30–13·44, p_interaction=0·012).
EVT achieves better outcomes at 90 days than standard medical therapy across a broad range of baseline imaging categories, including infarcts affecting more than 33% of middle cerebral artery territory or ASPECTS less than 6, although in these patients the risk of symptomatic intracranial haemorrhage was higher in the EVT group than the control group. This analysis provides preliminary evidence for potential use of EVT in patients with large infarcts at baseline.
Medtronic.
Journal Article
Hybrid CNN-SVM Classifier Approaches to Process Semi-Structured Data in Sugarcane Yield Forecasting Production
by Rao, N. Thirupathi; Joshua, Eali Stephen Neal; Kim, Tai-hoon
in Agricultural economics; Agricultural practices; Agricultural production
2023
Information and communication technology (ICT) breakthroughs have boosted global social and economic progress. Most rural Indians rely on agriculture for income, and the growing population requires modern agricultural practices. ICT is crucial for educating farmers on environmentally friendly methods, helping them produce more food by solving a variety of challenges. India’s sugarcane crop is popular and lucrative. Sugarcane is a long-duration crop that does not need a specific soil type but does demand water: the ground should always hold adequate moisture because cane growth is closely linked to evaporation. This research focuses on forecasting soil moisture and classifying sugarcane output, since sugarcane has so many applications that it must be categorized. The work proceeds in two phases: the first-phase model predicts soil moisture using two-level ensemble classifiers; in the second phase, to boost performance, the proposed ensemble model integrates the Gaussian probabilistic method (GPM), a convolutional neural network (CNN), and support vector machines (SVM). The approach aims to correctly anticipate future soil moisture measurements affecting crop growth and cultivation. The proposed model is 89.53% more accurate than conventional neural network classifiers. The models’ outcomes will assist farmers and agricultural authorities in boosting production.
Journal Article
Bayesian Probabilistic Numerical Methods in Time-Dependent State Estimation for Industrial Hydrocyclone Equipment
by Oates, Chris J.; Cockayne, Jon; Girolami, Mark
in Applications and Case Studies; Bayesian analysis; Bayesian theory
2019
The use of high-power industrial equipment, such as large-scale mixing equipment or a hydrocyclone for separation of particles in liquid suspension, demands careful monitoring to ensure correct operation. The fundamental task of state-estimation for the liquid suspension can be posed as a time-evolving inverse problem and solved with Bayesian statistical methods. In this article, we extend Bayesian methods to incorporate statistical models for the error that is incurred in the numerical solution of the physical governing equations. This enables full uncertainty quantification within a principled computation-precision trade-off, in contrast to the over-confident inferences that are obtained when all sources of numerical error are ignored. The method is cast within a sequential Monte Carlo framework and an optimized implementation is provided in Python.
Journal Article
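The sequential Monte Carlo framework mentioned in the abstract above can be illustrated with a minimal bootstrap particle filter. The random-walk state model, noise levels, seed, and particle count below are illustrative assumptions for a generic 1-D tracking problem, not the paper's hydrocyclone model or its optimized implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.1, obs_std=0.2):
    """Minimal bootstrap particle filter for a 1-D random-walk state.

    Hypothetical model (NOT the paper's hydrocyclone model):
        x_t = x_{t-1} + process noise,   y_t = x_t + observation noise.
    Returns the posterior-mean state estimate at each time step.
    """
    particles = rng.normal(0.0, 1.0, n_particles)  # draws from the prior
    means = []
    for y in observations:
        # Propagate every particle through the state dynamics.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight by the Gaussian observation likelihood.
        w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        w /= w.sum()
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choice(particles, size=n_particles, p=w)
        means.append(particles.mean())
    return np.array(means)

# Synthetic data: track a slowly drifting state.
true_x = np.cumsum(rng.normal(0.0, 0.1, 50))
obs = true_x + rng.normal(0.0, 0.2, 50)
est = bootstrap_particle_filter(obs)
```

A probabilistic-numerics treatment as in the paper would additionally place a statistical model on the discretisation error of the governing-equation solver inside each propagation step.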
Influence of input uncertainty on the 1-D hygrothermal simulation of composite walls in China
by Fang, Jinzhong; Feng, Chi; Gan, Weinan
in Atmospheric Protection/Air Quality Control/Air Pollution; Building Construction and Design; Condensation
2025
In hygrothermal simulations, uncertainties in input parameters affect the reliability of outputs, potentially leading to erroneous judgments about building performance. This study proposes a probabilistic method to comprehensively evaluate input uncertainties in the simulated hygrothermal performance of 1-D composite walls in five thermal design zones in China. Specifically, input parameters are categorized into discrete (e.g., insulation type) and continuous (e.g., thermal conductivity) parameters. The discrete parameters form 160 basic simulation scenarios. For each scenario, 400 uniform random samples are generated within given ranges of continuous parameters using Latin hypercube sampling, resulting in 64,000 simulation cases. Heat flux, moisture flux, moisture content, interstitial condensation, mold growth and frost damage are used as indicators for the hygrothermal performance, and multiple linear regression analysis is used to determine the most influential continuous parameters. The results indicate that the thermal conductivity of insulation materials has the greatest influence on the average heat flux, interstitial condensation risk, mold growth risk, and freeze/thaw cycles. The moisture retention curve and vapor diffusion resistance factor of structural materials significantly influence the average moisture flux. The moisture retention curve of structural materials has the most influence on the average moisture content of the structural layer. This study elucidates the most critical input parameters for different thermal design zones in China and other similar climate regions.
Journal Article
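The Latin hypercube sampling step described in the abstract above can be sketched with a small pure-NumPy helper. The three parameter names and ranges below (insulation thermal conductivity, density, porosity) are hypothetical placeholders, not the study's actual inputs:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Latin hypercube sample: one draw per equal-probability stratum
    along each dimension, with strata randomly permuted per dimension."""
    d = len(bounds)
    # Stratified uniform draws in [0, 1): exactly one point per stratum.
    u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    # Decouple dimensions by shuffling each column independently.
    for j in range(d):
        rng.shuffle(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(42)
# Hypothetical continuous-parameter ranges (illustrative only):
# thermal conductivity [W/(m*K)], density [kg/m^3], porosity [-]
bounds = [(0.03, 0.06), (20.0, 100.0), (0.85, 0.98)]
samples = latin_hypercube(400, bounds, rng)   # 400 cases, 3 parameters
```

Compared with plain uniform sampling, this guarantees that each parameter's range is covered evenly even at 400 samples, which is why the study can afford 400 runs per scenario rather than many thousands.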
Susceptibility Analysis of the Mt. Umyeon Landslide Area Using a Physical Slope Model and Probabilistic Method
2020
Every year, many countries carry out landslide susceptibility analyses to establish and manage countermeasures and reduce the damage caused by landslides. Because increases in the areas of landslides lead to new landslides, there is a growing need for landslide prediction to reduce such damage. Among the various methods for landslide susceptibility analysis, statistical methods require information about the landslide occurrence point. Meanwhile, analysis based on physical slope models can estimate stability by considering the slope characteristics, which can be applied based on information about the locations of landslides. Therefore, in this study, a probabilistic method based on a physical slope model was developed to analyze landslide susceptibility. To this end, an infinite slope model was used as the physical slope model, and Monte Carlo simulation was applied based on landslide inventory including landslide locations, elevation, slope gradient, specific catchment area (SCA), soil thickness, unit weight, cohesion, friction angle, hydraulic conductivity, and rainfall intensity; deterministic analysis was also performed for the comparison. The Mt. Umyeon area, a representative case for urban landslides in South Korea where large scale human damage occurred in 2011, was selected for a case study. The landslide prediction rate and receiver operating characteristic (ROC) curve were used to estimate the prediction accuracy so that we could compare our approach to the deterministic analysis. The landslide prediction rate of the deterministic analysis was 81.55%; in the case of the Monte Carlo simulation, when the failure probabilities were set to 1%, 5%, and 10%, the landslide prediction rates were 95.15%, 91.26%, and 90.29%, respectively, which were higher than the rate of the deterministic analysis. 
Finally, according to the area under the curve of the ROC curve, the prediction accuracy of the probabilistic model was 73.32%, likely due to the variability and uncertainty in the input variables.
Journal Article
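The probabilistic workflow described above, an infinite slope stability model evaluated under Monte Carlo sampling of its inputs, can be sketched as follows. The factor-of-safety expression is the standard infinite-slope formula with slope-parallel seepage; every distribution below is an illustrative placeholder, not a value from the Mt. Umyeon landslide inventory:

```python
import numpy as np

rng = np.random.default_rng(7)
GAMMA_W = 9.81  # unit weight of water, kN/m^3

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, m):
    """Factor of safety for an infinite slope with slope-parallel seepage.
    c: cohesion (kPa), phi: friction angle (deg), gamma: soil unit weight
    (kN/m^3), z: soil thickness (m), beta: slope angle (deg),
    m: groundwater saturation ratio (0-1)."""
    beta = np.radians(beta_deg)
    phi = np.radians(phi_deg)
    resisting = c + (gamma - m * GAMMA_W) * z * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Illustrative input distributions (NOT the study's inventory values):
n = 100_000
c = rng.normal(8.0, 2.0, n).clip(min=0.1)   # cohesion, kPa
phi = rng.normal(30.0, 3.0, n)              # friction angle, deg
gamma = rng.normal(18.0, 1.0, n)            # unit weight, kN/m^3
z = rng.uniform(0.5, 2.0, n)                # soil thickness, m
m = rng.uniform(0.0, 1.0, n)                # saturation ratio

fs = infinite_slope_fs(c, phi, gamma, z, 35.0, m)
p_failure = np.mean(fs < 1.0)  # failure probability for this grid cell
```

Classifying a cell as unstable when `p_failure` exceeds a threshold (1%, 5%, 10%) reproduces the kind of cutoff comparison reported in the abstract, whereas the deterministic analysis would evaluate the formula once at fixed input values.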
Comparative Assessment of the Reliability of Non-Recoverable Subsystems of Mining Electronic Equipment Using Various Computational Methods
by Pogrebnoy, Alexander V.; Kondratiev, Viktor V.; Kurdyumov, Georgy E.
in Accuracy; Algorithms; Approximation
2026
The assessment of reliability in non-repairable subsystems of mining electronic equipment represents a computationally challenging problem, particularly for complex and highly connected structures. This study presents a systematic comparative analysis of several deterministic approaches for reliability estimation, focusing on their computational efficiency, accuracy, and applicability. The investigated methods include classical boundary techniques (minimal paths and cuts), analytical decomposition based on the Bayes theorem, the logic–probabilistic method (LPM) employing triangle–star transformations, and the algorithmic Structure Convolution Method (SCM), which is based on matrix reduction of the system’s connectivity graph. The reliability problem is formally represented using graph theory, where each element is modeled as a binary variable with independent failures, which is a standard and practically justified assumption for power electronic subsystems operating without common-cause coupling. Numerical experiments were carried out on canonical benchmark topologies—bridge, tree, grid, and random connected graphs—representing different levels of structural complexity. The results demonstrate that the SCM achieves exact reliability values with up to six orders of magnitude acceleration compared to the LPM for systems containing more than 20 elements, while maintaining polynomial computational complexity. Qualitatively, the compared approaches differ in the nature of the output and practical applicability: boundary methods provide fast interval estimates suitable for preliminary screening, whereas decomposition may exhibit a systematic bias for highly connected (non-series–parallel) topologies. In contrast, the SCM consistently preserves exactness while remaining computationally tractable for medium and large sparse-to-moderately dense graphs, making it preferable for repeated recalculations in design and optimization workflows. 
The methods were implemented in Python 3.7 using NumPy and NetworkX, ensuring transparency and reproducibility. The findings confirm that the SCM is an efficient, scalable, and mathematically rigorous tool for reliability assessment and structural optimization of large-scale non-repairable systems. The presented methodology provides practical guidelines for selecting appropriate reliability evaluation techniques based on system complexity and computational resource constraints.
Journal Article
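As a reference point for the benchmark topologies discussed above, the exact s-t reliability of the five-element bridge network can be computed by brute-force state enumeration. This is plain enumeration for illustration, not the paper's Structure Convolution Method; the node and edge labels are arbitrary:

```python
from itertools import product

# Bridge network: terminals s and t, five independently failing edges,
# with ("a", "b") as the bridging element.
EDGES = [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t"), ("a", "b")]

def connected(up_edges, src="s", dst="t"):
    """Depth-first search over the working edges only."""
    adj = {}
    for u, v in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, stack = {src}, [src]
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

def bridge_reliability(p):
    """Exact s-t reliability: sum P(state) over all 2^5 edge states,
    counting states in which s and t remain connected."""
    total = 0.0
    for state in product([True, False], repeat=len(EDGES)):
        prob = 1.0
        for works in state:
            prob *= p if works else (1.0 - p)
        if connected([e for e, w in zip(EDGES, state) if w]):
            total += prob
    return total
```

Enumeration visits all 2^n edge states, which is exactly the exponential cost that the boundary, decomposition, and convolution methods compared in the paper are designed to avoid; for the bridge it matches the closed form 2p² + 2p³ − 5p⁴ + 2p⁵.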
A Meeting Point of Probability, Graphs, and Algorithms: The Lovász Local Lemma and Related Results—A Survey
2021
A classic and fundamental result, known as the Lovász Local Lemma, is a gem in the probabilistic method of combinatorics. At a high level, its core message can be described by the claim that weakly dependent events behave similarly to independent ones. A fascinating feature of this result is that even though it is a purely probabilistic statement, it provides a valuable and versatile tool for proving completely deterministic theorems. The Lovász Local Lemma has found many applications; despite being originally published in 1973, it still attracts active novel research. In this survey paper, we review various forms of the Lemma, as well as some related results and applications.
Journal Article
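For reference, the symmetric form of the Lemma surveyed above can be stated compactly:

```latex
% Symmetric Lovász Local Lemma: let A_1, ..., A_n be events with
% Pr(A_i) <= p for all i, each mutually independent of all but at
% most d of the others. If e * p * (d + 1) <= 1 (e = Euler's number),
% then
\[
  \Pr\left( \bigcap_{i=1}^{n} \overline{A_i} \right) > 0 ,
\]
% i.e., with positive probability none of the "bad" events occurs.
```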
Small Simplicial Complexes with Prescribed Torsion in Homology
2019
For d ≥ 2 and G a finite abelian group, define T_d(G) to be the minimum number of vertices n so that there exists a simplicial complex X on n vertices which has the torsion part of H_{d−1}(X) isomorphic to G. Here we use the probabilistic method, in particular the Lovász Local Lemma, to establish an upper bound on T_d(G) which matches the known lower bound up to a constant factor. That is, we prove that for every d ≥ 2 there exist constants c_d and C_d so that for any finite abelian group G, c_d (log |G|)^{1/d} ≤ T_d(G) ≤ C_d (log |G|)^{1/d}.
Journal Article
Probabilistic estimation of earthquake source location and magnitude using inverse analysis of regional paleoliquefaction studies
by Kumar, Ritesh; Kanth, Aparna; Bishoyi, Nitarani
in Cohesionless soils; Data analysis; Earthquake engineering
2023
Liquefaction is one of the most significant and remarkable causes of ground failure in geotechnical earthquake engineering. The phenomenon mostly occurs in saturated cohesionless soils when subjected to seismic loading. Studies on past liquefaction evidence, also known as paleoliquefaction studies, have helped several researchers predict a particular region’s future vulnerabilities. However, it is always difficult to prepare human life for future devastation resulting from ground failures. Prior estimation of the magnitude and likelihood of earthquakes that may strike a location in the near future can create an environment involving fewer risk factors. Several methods are available to back-calculate the strength of shaking and earthquake magnitude from seismic evidence, such as paleoliquefaction. Knowing the origin of an earthquake aids in locating the fault zone. As a result of these historical investigations, information for seismic hazard analyses and ground motion forecasts for a particular region becomes possible. The present study is designed on similar grounds to carry out the investigation. A total of nine sites are selected in the Roorkee region, which is vulnerable to earthquakes. The region is also prone to liquefaction based on experimental evidence available from past studies. The Standard Penetration Test data analysis performed on all nine sites is used for site characterization. For probabilistic earthquake source characterization, magnitudes between 3.5 and 8.5 and PGA between 0.05 and 0.5 are considered. For the interpretation of the most likely source location and its corresponding likelihood of magnitude, both site and source data are utilized in the ground motion model. The findings show that with the increase in source-to-site distance, the likelihood of source occurrence reduces, whereas the most likely magnitude increases. Eventually, this framework illustrates a probabilistic method for determining the seismic source parameters based on paleoliquefaction inverse analyses.
Journal Article
A simplified vine copula-based probabilistic method for quantifying multi-dimensional ecological niches and niche overlap: take a three-dimensional case as an example
2024
For quantifying m-dimensional (m≥3) niche regions and niche overlaps using a copula-based approach, commonly used copulas, including Archimedean and elliptical copula families, are unsatisfactory alternatives in characterizing a complex dependence structure among multiple variables, especially when bi-variate copulas characterizing dependency structures of two-dimensional sub-variables differ. To solve the problem, we improve the copula-based niche space modeling approach using simplified vine copulas, a powerful tool containing various bi-variate dependence structures in one multivariate copula. Using four simulated data sets, we then check the performance of simplified vine copula approximation when the simplifying assumption is invalid. Finally, we apply the improved copula-based approach to quantifying a three-dimensional niche space of a real case of Swanson et al. (Ecology 96(2):318–324, 2015. https://doi.org/10.1890/14-0235.1) and discover that among various simplified vine and other flexible multi-dimensional copulas, non-parametric simplified vine copula approximation performs best in fitting the data set. In the discussion, to analyze differences in calculating niche overlaps caused by using different copulas, we compare non-parametric simplified vine copula approximation with non-parametric and parametric simplified vine copula approximation, elliptical copula, Hierarchical Archimedean copula estimation, and empirical beta copula and give some comments on the results.
Journal Article