28,979 result(s) for "Stochastic methods"
Imaging features and safety and efficacy of endovascular stroke treatment: a meta-analysis of individual patient-level data
Evidence regarding whether imaging can be used effectively to select patients for endovascular thrombectomy (EVT) is scarce. We aimed to investigate the association between baseline imaging features and safety and efficacy of EVT in acute ischaemic stroke caused by anterior large-vessel occlusion. In this meta-analysis of individual patient-level data, the HERMES collaboration identified in PubMed seven randomised trials in endovascular stroke that compared EVT with standard medical therapy, published between Jan 1, 2010, and Oct 31, 2017. Only trials that required vessel imaging to identify patients with proximal anterior circulation ischaemic stroke and that used predominantly stent retrievers or second-generation neurothrombectomy devices in the EVT group were included. Risk of bias was assessed with the Cochrane handbook methodology. Central investigators, masked to clinical information other than stroke side, categorised baseline imaging features of ischaemic change with the Alberta Stroke Program Early CT Score (ASPECTS) or according to involvement of more than 33% of middle cerebral artery territory, and by thrombus volume, hyperdensity, and collateral status. The primary endpoint was neurological functional disability scored on the modified Rankin Scale (mRS) score at 90 days after randomisation. Safety outcomes included symptomatic intracranial haemorrhage, parenchymal haematoma type 2 within 5 days of randomisation, and mortality within 90 days. For the primary analysis, we used mixed-methods ordinal logistic regression adjusted for age, sex, National Institutes of Health Stroke Scale score at admission, intravenous alteplase, and time from onset to randomisation, and we used interaction terms to test whether imaging categorisation at baseline modifies the association between treatment and outcome. This meta-analysis was prospectively designed by the HERMES executive committee but has not been registered. 
Among 1764 pooled patients, 871 were allocated to the EVT group and 893 to the control group. Risk of bias was low except in the THRACE study, which used unblinded assessment of outcomes 90 days after randomisation and MRI predominantly as the primary baseline imaging tool. The overall treatment effect favoured EVT (adjusted common odds ratio [cOR] for a shift towards better outcome on the mRS 2·00, 95% CI 1·69–2·38; p<0·0001). EVT achieved better outcomes at 90 days than standard medical therapy alone across a broad range of baseline imaging categories. Mortality at 90 days (14·7% vs 17·3%, p=0·15), symptomatic intracranial haemorrhage (3·8% vs 3·5%, p=0·90), and parenchymal haematoma type 2 (5·6% vs 4·8%, p=0·52) did not differ between the EVT and control groups. No treatment effect modification by baseline imaging features was noted for mortality at 90 days and parenchymal haematoma type 2. Among patients with ASPECTS 0–4, symptomatic intracranial haemorrhage was seen in ten (19%) of 52 patients in the EVT group versus three (5%) of 66 patients in the control group (adjusted cOR 3·94, 95% CI 0·94–16·49; p_interaction=0·025), and among patients with more than 33% involvement of middle cerebral artery territory, symptomatic intracranial haemorrhage was observed in 15 (14%) of 108 patients in the EVT group versus four (4%) of 113 patients in the control group (4·17, 1·30–13·44, p_interaction=0·012). EVT achieves better outcomes at 90 days than standard medical therapy across a broad range of baseline imaging categories, including infarcts affecting more than 33% of middle cerebral artery territory or ASPECTS less than 6, although in these patients the risk of symptomatic intracranial haemorrhage was higher in the EVT group than the control group. This analysis provides preliminary evidence for potential use of EVT in patients with large infarcts at baseline. Medtronic.
A reliable numerical analysis for stochastic dengue epidemic model with incubation period of virus
This article presents a numerical analysis of a stochastic dengue epidemic model with an incubation period of the virus. We compare solutions of the stochastic dengue model with those of a deterministic dengue model and show that the stochastic dengue epidemic model is more realistic than the deterministic one. The threshold number R1 governs the dynamics of the stochastic dengue epidemic model: if R1 < 1, the disease can be controlled, while R1 > 1 indicates persistence of the disease in the population. Unfortunately, standard numerical methods such as Euler–Maruyama, stochastic Euler, and stochastic Runge–Kutta fail for large time step sizes. The proposed stochastic nonstandard finite difference scheme (SNSFD) is independent of the step size and preserves all the dynamical properties, such as positivity, boundedness, and dynamical consistency.
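As a point of reference for the step-size discussion in this abstract, the Euler–Maruyama scheme it mentions can be sketched for a generic scalar SDE. This is the textbook scheme, not the paper's SNSFD; the drift and diffusion functions below are illustrative toy choices, not the dengue model's.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_end, n_steps, rng=None):
    """Simulate a scalar SDE dX = drift(X) dt + diffusion(X) dW
    with the Euler-Maruyama scheme on a uniform grid."""
    rng = rng or np.random.default_rng(0)
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dw
    return x

# Toy mean-reverting process dX = -2X dt + 0.1X dW (illustrative only).
path = euler_maruyama(lambda x: -2.0 * x, lambda x: 0.1 * x,
                      x0=1.0, t_end=5.0, n_steps=500)
```

With a small step (dt = 0.01 here) the scheme tracks the decaying solution; the abstract's point is that for large step sizes explicit schemes like this can violate positivity and boundedness, which is what motivates the nonstandard scheme.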
Dynamic stochastic projection method for multistage stochastic variational inequalities
Stochastic approximation (SA) type methods have been well studied for solving single-stage stochastic variational inequalities (SVIs). This paper proposes a dynamic stochastic projection method (DSPM) for solving multistage SVIs. In particular, we investigate an inexact single-stage SVI and present an inexact stochastic projection method (ISPM) for solving it. We then obtain the DSPM for a three-stage SVI by applying the ISPM to each stage. We show that the DSPM achieves an O(1/ε²) convergence rate with respect to the total number of required scenarios for the three-stage SVI. We also extend the DSPM to multistage SVIs with more than three stages. Numerical experiments illustrate the effectiveness and efficiency of the DSPM.
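The single-stage SA projection iteration underlying methods of this kind can be sketched as follows. This is the generic projected stochastic approximation step, not the paper's ISPM/DSPM; the operator, feasible set and step-size rule below are illustrative assumptions.

```python
import numpy as np

def sa_projection_method(sample_F, project, x0, n_iters, step0=1.0, rng=None):
    """Projected stochastic approximation for a single-stage SVI:
    find x* in X with E[F(x*, xi)]^T (x - x*) >= 0 for all x in X,
    via x_{k+1} = Proj_X(x_k - gamma_k * F(x_k, xi_k))."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        gamma = step0 / np.sqrt(k + 1)          # diminishing step size
        x = project(x - gamma * sample_F(x, rng))
    return x

# Toy SVI: F(x, xi) = x - xi with E[xi] = 0.3, feasible set X = [0, 1]^2,
# whose solution is x* = (0.3, 0.3).
sample_F = lambda x, rng: x - rng.normal(0.3, 0.1, size=x.shape)
project = lambda x: np.clip(x, 0.0, 1.0)        # projection onto the box
x_star = sa_projection_method(sample_F, project, x0=np.zeros(2), n_iters=2000)
```

The multistage construction in the paper applies an inexact version of this step stage by stage; this sketch only shows the basic projection mechanism.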
Stability analysis of soil-rock slope (SRS) with an improved stochastic method and physical models
With the development of transportation infrastructure, many highways have been built on top of soil–rock slopes (SRSs). However, the effect of highway loads on SRS stability has never been studied comprehensively. Therefore, based on statistical analysis, the stability of SRSs under additional highway loads was studied. To generate a more realistic slope model, an identification algorithm for rock characteristic parameters was described that accounts for rock ellipticity and long-axis inclination angle; the corresponding rock-contour generation method and SRS model-building process, which account for rock content, ellipticity and long-axis inclination angle, were detailed. Using the stochastic program, 522 stochastic numerical models and 12 corresponding physical models were created to study the influence of rock content and long-axis inclination angle on SRS stability. The results showed that the additional loads and the dispersion of the stochastic analysis results increased with increasing rock content, which was related to the plastic development modes (detour, through, scatter and contain modes) of the SRS. By adjusting the long-axis inclination angles of the rocks, it was observed that the minimum or maximum additional load was obtained when this angle was, respectively, parallel or perpendicular to the plastic belt. The effects of long-axis inclination angle on the additional load (30.5%, 38.3% and 60.8% for 20%, 40% and 60% rock content) were quantified, which demonstrates the necessity of considering the long-axis inclination angle of rocks when estimating SRS stability, especially at high rock content. According to the numerical analysis results and the failure characteristics of the physical models, three typical development modes of the plastic belt of an SRS were identified when the load is on top of the slope: deep, shallow and partial failure. In addition, the sliding body exhibits a collapse (whole-body) failure mode when the long-axis inclination angles of the rocks are perpendicular (parallel) to the plastic belt.
A Line Search Based Proximal Stochastic Gradient Algorithm with Dynamical Variance Reduction
Many optimization problems arising from machine learning applications can be cast as the minimization of the sum of two functions: the first typically represents the expected risk, which in practice is replaced by the empirical risk, while the second imposes a priori information on the solution. Since in general the first term is differentiable and the second is convex, proximal gradient methods are well suited to such optimization problems. However, in large-scale machine learning problems the computation of the full gradient of the differentiable term can be prohibitively expensive, making these algorithms unsuitable. For this reason, proximal stochastic gradient methods have been extensively studied in the optimization literature in recent decades. In this paper we develop a proximal stochastic gradient algorithm based on two main ingredients: a technique to dynamically reduce the variance of the stochastic gradients along the iterative process, combined with a descent condition in expectation for the objective function, which is used to set the steplength parameter at each iteration. For general objective functions, the a.s. convergence of the limit points of the sequence generated by the proposed scheme to stationary points can be proved. For convex objective functions, both the a.s. convergence of the whole sequence of iterates to a minimum point and an O(1/k) convergence rate for the objective function values are shown. The practical implementation of the proposed method requires neither the computation of the exact gradient of the empirical risk during the iterations nor the tuning of an optimal value for the steplength. Extensive numerical experiments show that the proposed approach is robust with respect to the setting of the hyperparameters and competitive with state-of-the-art methods.
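The proximal stochastic gradient template this abstract builds on can be sketched on a lasso problem. This is a generic sketch, not the authors' algorithm: here the variance of the stochastic gradients is reduced by a simple growing mini-batch, a stand-in for the paper's dynamical variance-reduction technique, and the steplength is a fixed constant rather than their expectation-based rule.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd_lasso(A, b, lam, n_iters=500, step=0.02, batch0=4, rng=None):
    """Proximal stochastic gradient for min_x (1/(2n))||Ax - b||^2 + lam*||x||_1.
    The mini-batch size grows along the iterations, a simple way to
    reduce the variance of the stochastic gradients."""
    rng = rng or np.random.default_rng(0)
    n, d = A.shape
    x = np.zeros(d)
    for k in range(n_iters):
        m = min(n, batch0 + k)                        # growing mini-batch
        idx = rng.choice(n, size=m, replace=False)
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / m   # stochastic gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Synthetic sparse regression problem (illustrative data).
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 10))
x_true = np.zeros(10); x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.normal(size=200)
x_hat = prox_sgd_lasso(A, b, lam=0.1)
```

Early iterations use cheap, noisy gradients; later ones approach the full gradient, which is the basic trade-off that dynamical variance reduction formalizes.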
Ground-motion models for earthquakes occurring in the United Kingdom
This article presents models to predict median horizontal elastic response spectral accelerations for 5% damping from earthquakes with moment magnitudes ranging from 3.5 to 7.25 occurring in the United Kingdom. This model was derived using the hybrid stochastic-empirical method based on an existing ground-motion model for California and a stochastic model for the UK that was developed specifically for this purpose. The model is presented in two consistent formats, both for two distance metrics, with different target end-users. Firstly, we provide a complete logic tree with 162 branches, and associated weights, capturing epistemic uncertainties in the depth to the top of rupture, geometric spreading, anelastic path attenuation, site attenuation and stress drop, which is more likely to be used for research. The weights for these branches were derived using Bayesian updating of a priori weights from expert judgment. Secondly, we provide a backbone model with three and five branches corresponding to different percentiles, with corresponding weights, capturing the overall epistemic uncertainty, which is tailored for engineering applications. The derived models are compared with ground-motion observations, both instrumental and macroseismic, from the UK and surrounding region (northern France, Belgium, the Netherlands, western Germany and western Scandinavia). These comparisons show that the model is well-centred (low overall bias and no obvious trends with magnitude or distance) and that the branches capture the body and range of the technically defensible interpretations. In addition, comparisons with ground-motion models that have been previously used within seismic hazard assessments for the UK show that ground-motion predictions from the proposed model match those from previous models quite closely for most magnitudes and distances. The models are available as computer subroutines for ease of use.
Drought disaster risk management based on optimal allocation of water resources
Drought risk management has gradually emerged as an important discipline, and traditional passive drought management is shifting toward active drought management. Drought risk assessment and control are the core of drought risk management. In this study, based on the precipitation anomaly (Pa) and a soil moisture content anomaly index, a stochastic drought index model was established to calculate the drought distribution under different probabilities. Considering the risk of disaster (H), the vulnerability of the environment (S), the exposure of the disaster-bearing body (V), and disaster prevention and mitigation capability (C), a water resource optimization allocation model based on a drought disaster risk assessment model was established to minimize the regional drought disaster risk. The developed models were applied in Heilongjiang Province, China, and the results showed that: (1) the drought indexes based on the stochastic method can reflect the regional drought under different probabilities, providing managers with comprehensive drought information to manage the disaster; (2) the optimal allocation of water resources can reduce the risk of drought disaster in drought-prone months and drought-prone areas; and (3) drought risk assessment and regulation considering grain yield can be used to effectively understand and alleviate drought effects in the study area, reduce farmers' economic losses and ensure local food security.
A Proximal Stochastic Quasi-Newton Algorithm with Dynamical Sampling and Stochastic Line Search
In the field of machine learning, many large-scale optimization problems can be decomposed into the sum of two functions: a smooth function and a nonsmooth function with a simple proximal mapping. In light of this, our paper introduces a novel variant of the proximal stochastic quasi-Newton algorithm, grounded in three key components: (i) developing an adaptive sampling method that dynamically increases the sample size during the iteration process, thus preventing rapid growth in sample size and mitigating the noise introduced by the stochastic approximation method; (ii) the integration of stochastic line search to ensure a sufficient decrease in the expected value of the objective function; and (iii) a stable update scheme for the stochastic modified limited-memory Broyden-Fletcher-Goldfarb-Shanno algorithm. For a general objective function, it can be proven that the limit points of the generated sequence almost surely converge to stationary points. Furthermore, the convergence rate and the number of required gradient computations for this process have been analyzed. In the case of a strongly convex objective function, a global linear convergence rate can be achieved, and the number of required gradient computations is thoroughly examined. Finally, numerical experiments demonstrate the robustness of the proposed method across various hyperparameter settings, establishing its competitiveness compared to state-of-the-art methods.
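The limited-memory BFGS ingredient named in this abstract is conventionally implemented with the two-loop recursion, which applies an implicit inverse-Hessian approximation built from stored curvature pairs. The sketch below is the standard textbook recursion, not the authors' stochastic modified variant.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """L-BFGS two-loop recursion: apply the implicit inverse-Hessian
    approximation built from curvature pairs (s_i, y_i) to grad and
    return the resulting search direction -H*grad."""
    q = np.asarray(grad, dtype=float).copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:  # initial scaling H0 = (s^T y / y^T y) * I from the newest pair
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return -q  # descent direction

# With no stored pairs the direction reduces to steepest descent; with
# pairs satisfying y^T s > 0 the implicit matrix is positive definite,
# so the returned direction is a descent direction.
```

A stable update scheme, as in the paper, additionally guards the curvature pairs so that y^T s stays safely positive despite stochastic gradient noise.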
Data-driven testing of the magnitude dependence of earthquake stress parameters using the NGA-West 2 dataset
In this study, we investigate the dependencies between ground-motion intensity measures (GMIM) and earthquake magnitudes (M) in order to evaluate the magnitude scaling of the dynamic stress parameter (Δσ). To achieve this, two types of datasets are used: a large subset of the NGA-West 2 (next generation attenuation) dataset, including 1700 records from 426 sites and 271 earthquakes, and datasets generated through the stochastic method (Boore 2003) assuming various dependencies (constant and variable) of the stress parameter on magnitude. Adaptive neuro-fuzzy inference systems (ANFIS) are used to derive data-driven ground-motion prediction models (Ameur et al. 2018). Stiff-soil (Vs30 > 500 m/s) data are selected, and the ground-motion models depend on two input parameters: the moment magnitude (Mw) and the hypocentral distance (Rhyp). Following Molkenthin et al. (2014), we assume that Δσ is the dominant controlling factor of GMIM for stiff site conditions at Rhyp = 30 km and frequency f = 3.33 Hz for moderate earthquakes in the magnitude range Mw = [4.5–6.5]. This study confirms that the relation between magnitude and stress parameter controls the scaling of ground motions. We show that magnitude-dependent stress drops better fit the latest generation of NGA-West 2 datasets and empirical ground-motion equations. We finally calibrate a relation between the dynamic stress parameter and earthquake magnitude in the range Mw = [4.5–6.5].
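In the point-source stochastic method cited above (Boore 2003), the stress parameter enters the simulated spectrum through the Brune corner frequency. A minimal sketch of that scaling, using the standard formula with Δσ in bars, shear-wave velocity in km/s and seismic moment in dyne-cm (the parameter values are illustrative, not taken from this study):

```python
def corner_frequency(mw, stress_drop_bars, beta_km_s=3.5):
    """Brune corner frequency used in the point-source stochastic method:
    fc = 4.9e6 * beta * (delta_sigma / M0)^(1/3),
    with beta in km/s, delta_sigma in bars and M0 in dyne-cm."""
    m0 = 10.0 ** (1.5 * mw + 16.05)   # seismic moment from moment magnitude
    return 4.9e6 * beta_km_s * (stress_drop_bars / m0) ** (1.0 / 3.0)
```

Holding Δσ constant makes fc fall steadily with magnitude, whereas a magnitude-dependent Δσ modifies that decay, which is exactly the spectral scaling the study tests against the NGA-West 2 data.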