1,270 result(s) for "bootstrap technique"
Linking Demography and Consumption of Henosepilachna vigintioctopunctata (Coleoptera: Coccinellidae) Fed on Solanum photeinocarpum (Solanales: Solanaceae): With a New Method to Project the Uncertainty of Population Growth and Consumption
Because life tables provide the most comprehensive description of the survival, stage differentiation, and reproduction of animal populations, they can be considered the foundation of population ecology and pest management. Researchers working with life tables inevitably face the problem of describing the variability that occurs in survival, stage differentiation, and fecundity data, and incorporating this variability into population projections for pest management can be problematic. Henosepilachna vigintioctopunctata (F.) (Coleoptera: Coccinellidae) is a pest of many plant species in Asia, including cultivated crops, ornamentals, and wild plants. The raw life history data (survival, stage differentiation, and fecundity) and consumption rates of both sexes of H. vigintioctopunctata reared on Solanum photeinocarpum Nakamura et Odashima (Solanales: Solanaceae) were collected in the laboratory and analyzed based on the age-stage, two-sex life table theory. The intrinsic rate of increase (r), finite rate of increase (λ), net reproductive rate (R0), mean generation time (T), and net consumption rate (C0) of H. vigintioctopunctata were 0.1312 d−1, 1.1402 d−1, 603.5 offspring, 48.8 d, and 77.8 cm2, respectively. Using the bootstrap technique with 100,000 samples, we demonstrated that life tables constructed from the 2.5th and 97.5th percentiles of R0 and λ can be used to describe the variability found in the survival and fecundity curves and to project the uncertainty of population growth.
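The percentile bootstrap this abstract describes can be sketched in a few lines. The fecundity numbers below are invented for illustration (they are not the paper's H. vigintioctopunctata data), and the resample count is reduced from the paper's 100,000 to keep the sketch fast:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-female lifetime fecundity counts (illustrative only)
offspring = np.array([580, 610, 595, 630, 570, 615, 600, 590, 625, 605])

# Percentile bootstrap for the net reproductive rate R0
n_boot = 10_000
boot_r0 = np.array([
    rng.choice(offspring, size=offspring.size, replace=True).mean()
    for _ in range(n_boot)
])

# The 2.5th and 97.5th percentiles bound a 95% percentile interval
lo, hi = np.percentile(boot_r0, [2.5, 97.5])
print(f"R0 95% percentile CI: [{lo:.1f}, {hi:.1f}]")
```

The same resampling, applied to whole life-history records rather than a single statistic, yields the percentile life tables the authors use for projection.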
Development and Reproductive Capacity of the Miyake Spider Mite Eotetranychus kankitus (Acari: Tetranychidae) at Different Temperatures
Eotetranychus kankitus (Acari: Tetranychidae) is an important pest of citrus. Assessing life history parameters is crucial to developing an ecologically sound pest management program. Of the many factors that affect the life history parameters of herbivorous insects and mites, temperature has the greatest influence on development rate and reproductive potential. We investigated the effects of temperatures from 15 to 40 °C on the demographic parameters of E. kankitus under a long-day (16:8 (L:D) h) photoperiod. The egg-to-adult development time of E. kankitus decreased as the temperature increased from 15 to 32.5 °C. At 35 °C, females laid eggs, but the resulting offspring died at the larval stage. The estimated lower thermal thresholds (t0) were 11.01 and 10.48 °C, and the thermal constants (K) were 190.67 and 188.63 degree-days for egg-to-adult females and egg-to-adult males, respectively. The intrinsic optimal temperatures (TØ) for development were 21.79 and 21.74 °C, respectively. The bootstrap-match technique was used to construct the life table parameters. The net reproductive rate (R0) decreased as temperature increased from 20 to 30 °C, but the lowest rate was observed at 15 °C. The intrinsic rate of natural increase (r) increased from 0.0299 day−1 at 15 °C to 0.1822 day−1 at 30 °C. These findings provide a critical theoretical basis for predicting the occurrence of E. kankitus populations under climate warming and for developing appropriate control strategies.
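The lower thermal threshold (t0) and thermal constant (K) reported above imply the standard linear degree-day model, under which development time at a constant temperature T is K / (T − t0). A minimal sketch using the paper's female parameters:

```python
# Linear degree-day model implied by the reported t0 and K values:
# development time D (days) = K / (T - t0) for T above the threshold
t0_female = 11.01    # lower thermal threshold, °C (egg-to-adult females)
K_female = 190.67    # thermal constant, degree-days

def development_days(temp_c: float, t0: float, K: float) -> float:
    """Predicted egg-to-adult development time at a constant temperature."""
    if temp_c <= t0:
        return float("inf")  # at or below the threshold, no development
    return K / (temp_c - t0)

print(round(development_days(25.0, t0_female, K_female), 1))  # → 13.6
```

So at a constant 25 °C the model predicts roughly 13.6 days from egg to adult female; the abstract's own data (faster development up to 32.5 °C) follow this pattern within the linear range.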
An advanced framework for tolerance analysis of cam-clamping devices integrating unified Jacobian–Torsor model, Monte Carlo simulation, and bootstrap technique
Uncertainty analysis is essential for estimating variability within specified tolerances, particularly in three-dimensional (3D) assembly tolerance analysis. This study introduces a novel analytical approach for assessing assembly deviations, integrating the Jacobian–Torsor model with the bootstrap technique. The Jacobian–Torsor model combines the efficiency of the torsor representation of tolerances with the adaptability of the Jacobian matrix for their propagation. This computerized method, based on the unified Jacobian–Torsor approach, focuses on cam-clamping devices, specifically the fastening flange component. The novelty of the study lies in applying the bootstrap technique, a Monte Carlo simulation approach, to estimate variability within the specified tolerances. A comprehensive comparison of statistical methods (bootstrap, stratified sampling, Bayesian statistics, and analytical methods) demonstrates the advantages of the bootstrap approach, emphasizing its user-friendliness and precision even with complex shapes. The primary aim is to highlight the utility of the unified Jacobian–Torsor method for tolerance analysis. An experiment involving the fastening flange assembly illustrates the practical application of this approach, and the findings confirm the effectiveness of the proposed method, demonstrating its accuracy and reliability for cam-clamping devices in real-world assembly scenarios with intricate geometries.
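The bootstrap layer of such a framework (attaching uncertainty estimates to Monte Carlo tolerance results) can be illustrated with a deliberately simplified 1-D stack-up standing in for the 3-D Jacobian–Torsor model; the tolerance values below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D stand-in for a 3-D stack-up: the functional clearance is
# the sum of three independent part deviations (Monte Carlo simulation)
n_mc = 5_000
deviations = rng.normal(0.0, [0.01, 0.02, 0.015], size=(n_mc, 3))  # mm
clearance = deviations.sum(axis=1)

# Bootstrap the simulated clearances to attach uncertainty to the spread
n_boot = 2_000
boot_sd = np.array([
    rng.choice(clearance, size=n_mc, replace=True).std()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_sd, [2.5, 97.5])
print(f"clearance SD ≈ {clearance.std():.4f} mm, 95% CI [{lo:.4f}, {hi:.4f}]")
```

In the paper's setting the Monte Carlo draws come from the torsor intervals of each feature rather than from independent normal deviations, but the bootstrap step over the simulated results is the same idea.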
Technical variability of cornea parameters derived from anterior segment OCT fitted with Fringe Zernike polynomials
Background This study uses bootstrapping to evaluate the technical variability (in terms of model parameter variation) of Zernike corneal surface fit parameters based on Casia2 biometric data. Methods Using a dataset containing N = 6953 Casia2 biometric measurements from a cataractous population, a Fringe Zernike polynomial surface of radial degree 10 (36 components) was fitted to the height data. The fit error (height minus reconstruction) was bootstrapped 100 times after normalisation. After reversing the normalisation, the bootstrapped fit errors were added to the reconstructed height, and characteristic surface parameters (flat/steep axis, radii, and asphericities in both axes) were extracted. The median parameters provide a robust surface representation for later estimates of elevation, whereas the SD over the 100 bootstraps quantifies the variability of the surface fit. Results Bootstrapping gave median radius and asphericity values of 7.74/7.68 mm and −0.20/−0.24 for the corneal front surface in the flat/steep meridian and 6.52/6.37 mm and −0.22/−0.31 for the corneal back surface. The respective SD values for the 100 bootstraps were 0.0032/0.0028 mm and 0.0093/0.0082 for the front and 0.0126/0.0115 mm and 0.0366/0.0312 for the back surface. The uncertainties for the back surface are systematically larger than those for the front surface. Conclusion As measured with the Casia2 tomographer, the fit parameters for the corneal back surface exhibit a larger degree of variability than those for the front surface. Further studies are needed to show whether these uncertainties are representative of situations where actual repeat measurements are possible.
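The residual-bootstrap procedure the abstract describes (resample the fit errors, add them back to the reconstruction, refit, and take the SD of the refitted parameters) can be sketched with a 1-D polynomial standing in for the 36-component Fringe Zernike surface; all data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D stand-in for the corneal height data
x = np.linspace(-1, 1, 200)
true_height = 0.5 * x**2 - 0.1 * x
height = true_height + rng.normal(0, 0.01, x.size)

# Initial fit and its residuals (height minus reconstruction)
coef = np.polyfit(x, height, 2)
resid = height - np.polyval(coef, x)

# Residual bootstrap: resample fit errors, add to the reconstruction, refit
n_boot = 100  # as in the study
boot_coefs = np.array([
    np.polyfit(x, np.polyval(coef, x)
               + rng.choice(resid, resid.size, replace=True), 2)
    for _ in range(n_boot)
])

# SD over the bootstraps quantifies the technical variability of the fit
print("coefficient SDs:", boot_coefs.std(axis=0))
```

The study additionally normalises the residuals before resampling and reverses the normalisation afterwards; that step is omitted here for brevity.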
Neural Network Analysis of Factors Influencing Forgotten Financial Remittances in Yemen
Remittances are vital for economic growth, especially in developing nations like Yemen. However, the phenomenon of forgotten financial remittances poses a significant threat to Yemen's financial stability, as these unreceived transfers negatively impact its economy. This study therefore aims to identify the main causes of the phenomenon, which include logistical and economic factors of senders and beneficiaries and features of banks and money exchange firms. The study surveyed 931 Yemeni respondents who had experienced forgotten financial remittances, each answering "yes" or "no" to 15 possible reasons in the questionnaire. Descriptive statistical measures were employed to characterize the sample, while neural network analysis (NNA) identified the main factors contributing to forgotten financial transfers. The analysis revealed three key dimensions. The first involves communication or access problems between the sender and recipient. The second involves logistical obstacles that hinder remittance flows, including technological, financial, administrative, and bureaucratic challenges; these barriers can be associated with senders, recipients, or exchange companies. The third relates to the operations of money exchange companies and the uniformity of their rates. To verify the stability of the results, the bootstrapping technique was applied to random samples of 500 observations drawn from the original dataset. The results demonstrated stability and reliability for all samples larger than 30% of the original sample size.
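The stability check described at the end of the abstract (resampling 500 of the 931 respondents and verifying that the findings hold) can be illustrated on synthetic yes/no answers; the response probability below is invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical yes/no answers to one questionnaire item (1 = "yes");
# invented data, not the real survey responses
answers = rng.binomial(1, 0.4, size=931)

# Stability check in the spirit of the study: bootstrap samples of 500
n_boot = 1_000
boot_props = np.array([
    rng.choice(answers, size=500, replace=True).mean()
    for _ in range(n_boot)
])
print(f"'yes' share: {answers.mean():.3f}; bootstrap SD: {boot_props.std():.4f}")
```

A small bootstrap SD relative to the observed share indicates that the item-level result is stable under resampling, which is the sense in which the study reports reliability for samples above 30% of the original size.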
Reservoir Inflow Prediction by Ensembling Wavelet and Bootstrap Techniques to Multiple Linear Regression Model
In this study, a new hybrid model, bootstrap multiple linear regression (BMLR), is proposed to investigate the potential of the bootstrap resampling technique for daily reservoir inflow prediction. The proposed model is compared with three others: multiple linear regression (MLR), wavelet multiple linear regression (WMLR), and wavelet bootstrap multiple linear regression (WBMLR). River stage data for the monsoon season (1 July 2010 to 30 September 2010) from three gauging stations of the Chenab river basin are used. In the wavelet transformation step, input vectors are decomposed using the discrete wavelet transformation (DWT) into discrete wavelet components (DWCs), and suitable DWCs are then used as input to the MLR model to develop the WMLR model. The bootstrap technique is coupled with the MLR model to build the BMLR model, while the WBMLR model combines suitable DWCs and the bootstrap technique with the MLR model. Performance indices, namely the root mean square error (RMSE), mean absolute error (MAE), Nash–Sutcliffe coefficient of efficiency (NSC), and persistence index (CP), are used to evaluate model performance. Results showed that the hybrid BMLR model produced significantly better results on the performance indices than the MLR, WMLR, and WBMLR models.
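The BMLR idea (fit an MLR on each bootstrap resample and average the ensemble's predictions) can be sketched on synthetic stage/inflow data; the station count matches the abstract, but the coefficients and noise level are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily data: river stage at three gauging stations (X)
# and reservoir inflow (y); invented, not the Chenab basin records
n = 90  # roughly one monsoon season of daily records
X = rng.normal(size=(n, 3))
y = X @ np.array([1.5, 0.8, -0.4]) + rng.normal(0, 0.2, n)

def mlr_fit(X, y):
    """Ordinary least-squares fit with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def mlr_predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# BMLR: fit an MLR on each bootstrap resample and average the predictions
n_boot = 200
preds = np.zeros((n_boot, n))
for b in range(n_boot):
    idx = rng.integers(0, n, n)          # resample days with replacement
    preds[b] = mlr_predict(mlr_fit(X[idx], y[idx]), X)

y_hat = preds.mean(axis=0)
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"BMLR in-sample RMSE: {rmse:.3f}")
```

The WMLR and WBMLR variants differ only in first decomposing the inputs with a discrete wavelet transform and feeding selected components into the same pipeline.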
Dissolved gas analysis method based on novel feature prioritisation and support vector machine
Dissolved gas analysis (DGA) has been widely used for the detection of incipient faults in oil-filled transformers. This research presents a novel approach to DGA feature prioritisation and classification, which considers not only the relations between a fault type and specific gas ratios but also their statistical characteristics based on data derived from onsite inspections. Firstly, new gas features are acquired based on the analysis of current international gas interpretation standards. Combined with conventional gas ratios, all features are then prioritised by using the Kolmogorov–Smirnov test. The rankings are obtained by using their values of maximum statistic distance. The first three features in ranking are employed as input vectors to a multi-layer support vector machine, whose tuning parameters are acquired by particle swarm optimisation. In the experiment, a bootstrap technique is implemented to approximately equalise sample numbers of different fault cases. A common 10-fold cross-validation technique is employed for performance assessment. Typical artificial intelligence classifiers with gas features extracted from genetic programming are evaluated for comparison purposes.
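The bootstrap step mentioned in the experiment (approximately equalising the sample numbers of the different fault classes before training the SVM) amounts to resampling the minority classes with replacement. A minimal sketch with invented class counts:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical imbalanced DGA fault labels: 120, 35, and 12 cases of
# three fault classes (invented counts, not the onsite inspection data)
labels = np.array([0] * 120 + [1] * 35 + [2] * 12)

# Bootstrap resampling to equalise class sizes at the majority count
target = np.bincount(labels).max()
balanced_idx = np.concatenate([
    rng.choice(np.where(labels == c)[0], size=target, replace=True)
    for c in np.unique(labels)
])
print(np.bincount(labels[balanced_idx]))  # → [120 120 120]
```

The balanced index array would then be used to select both the gas-ratio feature vectors and their labels when training the multi-layer support vector machine.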
A proposed methodology for deriving tsunami fragility functions for buildings using optimum intensity measures
Tsunami fragility curves are statistical models which form a key component of tsunami risk models, as they provide a probabilistic link between a tsunami intensity measure (TIM) and building damage. Existing studies apply different TIMs (e.g. depth, velocity, force etc.) with conflicting recommendations of which to use. This paper presents a rigorous methodology using advanced statistical methods for the selection of the optimal TIM for fragility function derivation for any given dataset. This methodology is demonstrated using a unique, detailed, disaggregated damage dataset from the 2011 Great East Japan earthquake and tsunami (total 67,125 buildings), identifying the optimum TIM for describing observed damage for the case study locations. This paper first presents the proposed methodology, which is broken into three steps: (1) exploratory analysis, (2) statistical model selection and trend analysis and (3) comparison and selection of TIMs. The case study dataset is then presented, and the methodology is then applied to this dataset. In Step 1, exploratory analysis on the case study dataset suggests that fragility curves should be constructed for the sub-categories of engineered (RC and steel) and non-engineered (wood and masonry) construction materials. It is shown that the exclusion of buildings of unknown construction material (common practice in existing studies) may introduce bias in the results; hence, these buildings are estimated as engineered or non-engineered through use of multiple imputation (MI) techniques. In Step 2, a sensitivity analysis of several statistical methods for fragility curve derivation is conducted in order to select multiple statistical models with which to conduct further exploratory analysis and the TIM comparison (to draw conclusions which are non-model-specific). 
Methods of data aggregation and ordinary least squares parameter estimation (both used in existing studies) are rejected as they are quantitatively shown to reduce fragility curve accuracy and increase uncertainty. Partially ordered probit models and generalised additive models (GAMs) are selected for the TIM comparison of Step 3. In Step 3, fragility curves are then constructed for a number of TIMs, obtained from numerical simulation of the tsunami inundation of the 2011 GEJE. These fragility curves are compared using K-fold cross-validation (KFCV), and it is found that for the case study dataset a force-based measure that considers different flow regimes (indicated by Froude number) proves the most efficient TIM. It is recommended that the methodology proposed in this paper be applied for defining future fragility functions based on optimum TIMs. With the introduction of several concepts novel to the field of fragility assessment (MI, GAMs, KFCV for model optimisation and comparison), this study has significant implications for the future generation of empirical and analytical fragility functions.
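The K-fold cross-validation used to compare TIMs in Step 3 can be sketched with a straight-line fit standing in for the paper's probit/GAM models; both intensity measures and the damage data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical damage data against two candidate intensity measures
# (illustrative stand-ins, not the GEJE dataset)
n = 300
tim_depth = rng.uniform(0.1, 5.0, n)                       # informative TIM
damage = 1 / (1 + np.exp(-(tim_depth - 2.5))) + rng.normal(0, 0.1, n)
tim_noisy = tim_depth + rng.normal(0, 1.0, n)              # weakly related TIM

def kfold_mse(x, y, k=10):
    """Mean out-of-fold squared error of a straight-line fit."""
    idx = rng.permutation(len(x))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], 1)
        errs.append(np.mean((y[fold] - np.polyval(coef, x[fold])) ** 2))
    return float(np.mean(errs))

mse_depth = kfold_mse(tim_depth, damage)
mse_noisy = kfold_mse(tim_noisy, damage)
print(f"out-of-fold MSE: depth {mse_depth:.4f}, noisy TIM {mse_noisy:.4f}")
```

The TIM with the lower out-of-fold error is preferred; the paper applies the same logic with ordinal damage states and probit/GAM likelihoods instead of a squared-error criterion.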
Shallow landslides susceptibility assessment in different environments
The spatial distribution of shallow landslides is strongly influenced by climatic conditions and environmental settings, which makes it difficult to implement an exhaustive monitoring technique for correctly assessing landslide susceptibility across different environmental contexts. In this work, a single methodological strategy, based on the statistical implementation of the generalized additive model (GAM), was applied. This method was used to investigate the shallow landslide predisposition of four sites with different geological, geomorphological, and land-use characteristics: the Rio Frate and Versa catchments (southern Lombardy) and the Vernazza and Pogliaschina catchments (eastern Liguria). Good overall predictive accuracy was obtained, as measured by the area under the ROC curve (AUROC), with values ranging from 0.76 to 0.82 and an estimated mean model accuracy of 0.70–0.75. The method showed high flexibility, which allowed good identification of the most significant predisposing factors for shallow landslide occurrence in the different investigated areas. In particular, detailed susceptibility maps were produced, allowing identification of shallow-landslide-prone areas. Combined with rainfall thresholds for triggering shallow landslides, this methodology may provide an innovative tool for improving spatial planning and early warning systems.
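The AUROC metric used above to assess predictive accuracy can be computed directly from the rank-sum (Mann–Whitney) identity. A minimal version, assuming no tied scores:

```python
import numpy as np

def auroc(labels, scores):
    """Area under the ROC curve via the rank-sum identity (no tied scores)."""
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # rank 1 = lowest score
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy check: perfectly separated scores give AUROC 1.0
print(auroc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # → 1.0
```

In the susceptibility setting, `labels` marks observed landslide/no-landslide cells and `scores` the GAM-predicted susceptibility; values of 0.76–0.82, as reported, indicate good discrimination.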
Estimation of Association between Healthcare System Efficiency and Policy Factors for Public Health
Objective: To assess the association between healthcare system efficiency and policy factors (the type of healthcare system and various health policy indicators). Methods: In this study, data envelopment analysis (DEA) with bootstrapping was applied to healthcare system efficiency to correct the bias of the efficiency scores and to rank countries appropriately. We analyzed data mainly from the 2014 OECD (Organisation for Economic Co-operation and Development) Health Data. After obtaining the efficiency scores, we used Tobit regression to analyze which policy factors contributed to healthcare system inefficiency. Results: Based on a five-type healthcare system classification, the results suggested that social health insurance systems (e.g., Austria, Germany, Switzerland) showed the lowest efficiency scores on average compared to other types, but no statistically significant difference in healthcare efficiency among the four types of healthcare systems was found. The pure technical efficiency of the healthcare system was negatively influenced by two main factors: user choice of basic insurance coverage and the degree of decentralization to sub-national governments. Conclusions: Our findings suggest that countries with relatively low healthcare system efficiency may learn from countries that implement policies with a low level of user choice and a high level of centralization, to achieve a more economical allocation of their healthcare resources.
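Real DEA solves a linear program per decision-making unit, and bias correction is typically done with the Simar–Wilson smoothed bootstrap; the deliberately naive one-input/one-output sketch below, with invented data, only conveys the idea of bootstrapping an efficiency frontier to gauge bias in the scores:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical one-input/one-output data for 20 countries:
# health spending (input) vs. a health outcome (output); invented values
spend = rng.uniform(2, 10, 20)
outcome = 60 + 3 * np.log(spend) + rng.normal(0, 1, 20)

# Naive frontier-relative efficiency score (1.0 = on the observed frontier)
ratio = outcome / spend
eff = ratio / ratio.max()

# Naive bootstrap of the frontier to gauge bias in the scores
n_boot = 1_000
boot_eff = np.array([
    ratio / rng.choice(ratio, ratio.size, replace=True).max()
    for _ in range(n_boot)
])
bias = boot_eff.mean(axis=0) - eff
print(f"mean efficiency {eff.mean():.3f}, mean bootstrap bias {bias.mean():+.4f}")
```

The systematically positive bias of this naive resampler is exactly why frontier estimation needs the smoothed bootstrap in practice: resampled frontiers can only fall inside the observed one, so raw scores are optimistic.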