1,390 result(s) for "Escobar, Luis A"
Effects of Antimodularity and Multiscale Influence in Random Boolean Networks
We investigate the effects of modularity, antimodularity, and multiscale influence on random Boolean networks (RBNs). On the one hand, we produced modular, antimodular, and standard RBNs and compared them to identify how antimodularity affects the dynamical behavior of RBNs. We found that the antimodular networks showed dynamics similar to the standard networks. Confirming previous results, modular networks had more complex dynamics. On the other hand, we generated multilayer RBNs in which each node of a higher-scale RBN contains its own lower-scale RBN. We observed the dynamics of micro- and macronetworks by adjusting parameters at each scale to reveal how the behavior of lower layers affects the behavior of higher layers and vice versa. We found that the statistical properties of macro-RBNs were changed by the parameters of micro-RBNs, but not the other way around. However, the precise patterns of the networks were dominated by the macro-RBNs. In other words, for statistical properties only upward causation was relevant, while for the detailed dynamics downward causation was prevalent.
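For readers unfamiliar with the underlying model, the sketch below shows a classical NK random Boolean network in Python: N nodes, each reading K randomly chosen inputs through a random Boolean lookup table, iterated until a previously visited state recurs. It is only a minimal illustration of the standard RBNs mentioned above; the modular, antimodular, and multilayer constructions studied in the paper would additionally constrain the wiring and are not reproduced here.

```python
# Minimal sketch of a classical NK random Boolean network (standard RBN).
# Modularity/antimodularity would further constrain how inputs are wired.
import random

def make_rbn(n_nodes, k_inputs, seed=0):
    rng = random.Random(seed)
    inputs = [rng.sample(range(n_nodes), k_inputs) for _ in range(n_nodes)]
    # One random Boolean function per node: an output bit for each of the 2^K input patterns.
    tables = [[rng.randint(0, 1) for _ in range(2 ** k_inputs)] for _ in range(n_nodes)]
    return inputs, tables

def step(state, inputs, tables):
    new_state = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for bit_pos, src in enumerate(ins):
            idx |= state[src] << bit_pos
        new_state.append(table[idx])
    return tuple(new_state)

def attractor_period(state, inputs, tables, max_steps=10_000):
    # Iterate until a previously seen state recurs; the cycle length is the attractor period.
    seen = {}
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]
        seen[state] = t
        state = step(state, inputs, tables)
    return None  # no recurrence found within max_steps

if __name__ == "__main__":
    inputs, tables = make_rbn(n_nodes=12, k_inputs=2, seed=42)
    init = tuple(random.Random(1).randint(0, 1) for _ in range(12))
    print("attractor period:", attractor_period(init, inputs, tables))
```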
Oxidative stress induces early-onset apoptosis of vascular smooth muscle cells and neointima formation in response to injury
The present study dissects the mechanisms underlying the rapid onset of apoptosis that precedes post-injury vascular remodelling. Using the rat balloon injury model, we demonstrated that a significant number of arterial vascular smooth muscle cells (VSMC) undergo apoptosis at 90 min after the procedure. This apoptotic wave caused significant loss in media cellularity (>90%) over the next 3 h and was accompanied by a marked accumulation of oxidative stress by-products in the vascular wall. Early apoptotic VSMC were rich in p38 mitogen-activated protein kinase (MAPK) and the transcription factor c-Jun and secreted IL-6 and GRO/KC into the milieu, as determined using multiplex bead assays. Neointima thickness increased steadily starting on day 3 as a result of pronounced repopulation of the media. A second apoptotic wave, detected at 14 days after injury, affected mostly the neointima and was insufficient to control hyperplasia. Suppression of reactive oxygen species (ROS) production using either the NAD(P)H oxidase inhibitor VAS2870 or pegylated superoxide dismutase (PEG-SOD) significantly decreased the number of apoptotic cells during the first apoptotic wave and showed a trend towards reduction in the neointima-to-media thickness ratio at 30 days post-injury. These results indicate that oxidative stress in response to injury induces early-onset apoptosis of VSMC through the activation of redox-sensitive MAPK pro-apoptotic pathways. This remodelling process leads to the local accumulation of inflammatory cytokines and repopulation of the media, which ultimately contribute to neointima formation.
Methods for Planning Repeated Measures Degradation Studies
Repeated measures degradation studies are used to assess product or component reliability when there are few or even no failures expected during a study. Such studies are often used to assess the shelf life of materials, components, and products. We show how to evaluate the properties of proposed test plans. Such evaluations are needed to identify statistically efficient tests. We consider test plans for applications where parameters related to the degradation distribution or the related lifetime distribution are to be estimated. We use the approximate large-sample variance-covariance matrix of the parameters of a mixed effects linear regression model for repeated measures degradation data to assess the effect of sample size (number of units and number of measurements within the units) on estimation precision of both degradation and failure-time distribution quantiles. We also illustrate the complementary use of simulation-based methods for evaluating and comparing test plans. These test-planning methods are illustrated with two examples. We provide the R code and examples as supplementary materials (available online on the journal web site) for this article.
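The simulation-based evaluation of test plans mentioned in the abstract can be illustrated with a small Monte Carlo sketch. The Python code below assumes a simple linear degradation model with random intercepts and slopes and a soft-failure threshold, and summarizes a plan's precision by the spread of the estimated median crossing time; all parameter values and the per-unit least-squares fit are illustrative stand-ins, not the paper's mixed-model procedure or its supplementary R code.

```python
# Hedged sketch: compare repeated-measures degradation test plans by Monte Carlo.
# A "plan" is (number of units, number of measurements per unit); its precision is
# summarized by the spread of the estimated median time to cross a degradation threshold.
import numpy as np

def plan_precision(n_units, n_meas, t_max=1000.0, threshold=-5.0,
                   beta=(0.0, -0.01), sd_b0=0.5, sd_b1=0.002, sd_eps=0.3,
                   n_rep=500, seed=0):
    rng = np.random.default_rng(seed)
    times = np.linspace(0.0, t_max, n_meas)
    est_t50 = []
    for _ in range(n_rep):
        slopes = []
        for _ in range(n_units):
            b0 = rng.normal(beta[0], sd_b0)        # unit-specific intercept
            b1 = rng.normal(beta[1], sd_b1)        # unit-specific degradation rate
            y = b0 + b1 * times + rng.normal(0.0, sd_eps, size=n_meas)
            # Per-unit least-squares slope (a simple stand-in for a mixed-model fit).
            slopes.append(np.polyfit(times, y, 1)[0])
        mean_slope = np.mean(slopes)
        est_t50.append(threshold / mean_slope)     # time at which the mean path hits the threshold
    return np.std(est_t50)

if __name__ == "__main__":
    for n_units, n_meas in [(10, 5), (20, 5), (10, 10)]:
        sd = plan_precision(n_units, n_meas)
        print(f"{n_units} units x {n_meas} measurements -> SD of estimated median ~ {sd:.1f}")
```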
A Review of Accelerated Test Models
Engineers in the manufacturing industries have used accelerated test (AT) experiments for many decades. The purpose of AT experiments is to acquire reliability information quickly. Test units of a material, component, subsystem, or entire system are subjected to higher-than-usual levels of one or more accelerating variables such as temperature or stress. The AT results are then used to predict the life of the units at use conditions. The extrapolation is typically justified (correctly or incorrectly) on the basis of physically motivated models or a combination of empirical model fitting with a sufficient amount of previous experience in testing similar units. The need to extrapolate in both time and the accelerating variables generally necessitates the use of fully parametric models. Statisticians have made important contributions in the development of appropriate stochastic models for AT data [typically a distribution for the response and regression relationships between the parameters of this distribution and the accelerating variable(s)], statistical methods for AT planning (choice of accelerating-variable levels and allocation of available test units to those levels), and methods of estimation of suitable reliability metrics. This paper provides a review of many of the AT models that have been used successfully in this area.
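As one example of the kind of physically motivated model reviewed in this literature, the sketch below computes the Arrhenius acceleration factor, i.e., how much faster a temperature-driven failure mechanism runs at an elevated test temperature than at the use temperature. The 0.7 eV activation energy and the temperatures are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of the Arrhenius temperature-acceleration model.
# The activation energy below (0.7 eV) is purely illustrative.
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(temp_use_c, temp_test_c, activation_energy_ev):
    """Acceleration factor: ratio of the failure-causing reaction rate at the
    test temperature to the rate at the use temperature."""
    t_use = temp_use_c + 273.15
    t_test = temp_test_c + 273.15
    return math.exp((activation_energy_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))

if __name__ == "__main__":
    af = arrhenius_af(temp_use_c=40.0, temp_test_c=85.0, activation_energy_ev=0.7)
    print(f"acceleration factor at 85 C test vs 40 C use: {af:.1f}")
```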
Accelerated Degradation Tests: Modeling and Analysis
High-reliability systems generally require individual system components to have extremely high reliability over long periods of time. Short product development times require reliability tests to be conducted with severe time constraints. Frequently, few or no failures occur during such tests, even with acceleration. Thus, it is difficult to assess reliability with traditional life tests that record only failure times. For some components, degradation measures can be taken over time. A relationship between component failure and amount of degradation makes it possible to use degradation models and data to make inferences and predictions about a failure-time distribution. This article describes degradation reliability models that correspond to physical failure mechanisms. We explain the connection between degradation reliability models and failure-time reliability models. Acceleration is captured by a model that describes the effect that temperature (or another accelerating variable) has on the rate of a failure-causing chemical reaction. Approximate maximum likelihood estimation is used to estimate model parameters from the underlying mixed-effects nonlinear regression model. Simulation-based methods are used to compute confidence intervals for quantities of interest (e.g., failure probabilities). Finally, we use a numerical example to compare the results of accelerated degradation analysis and traditional accelerated life-test failure-time analysis.
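The connection between a degradation model and a failure-time distribution can be illustrated with a small simulation: units degrade linearly at a temperature-dependent rate (Arrhenius scaling), unit-to-unit variation is captured by a lognormal random rate, and failure is the first crossing of a fixed threshold. All parameter values in this Python sketch are assumptions for illustration, not estimates from the study.

```python
# Hedged sketch of the degradation-to-failure-time link: failure = first crossing of a
# degradation threshold, with a temperature-accelerated mean rate. Parameters are illustrative.
import numpy as np

BOLTZMANN_EV = 8.617e-5  # eV/K

def simulate_failure_times(temp_c, n_units=10_000, ea_ev=0.7, rate_at_25c=1e-3,
                           rate_sd_log=0.3, threshold=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Arrhenius scaling of the mean degradation rate with temperature.
    t_kelvin, t_ref = temp_c + 273.15, 25.0 + 273.15
    mean_rate = rate_at_25c * np.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t_kelvin))
    # Unit-to-unit variability: lognormal random degradation rates (a simple random-effects model).
    rates = mean_rate * rng.lognormal(mean=0.0, sigma=rate_sd_log, size=n_units)
    # Linear degradation path => failure time is threshold / rate.
    return threshold / rates

if __name__ == "__main__":
    for temp in (25.0, 65.0, 105.0):
        ft = simulate_failure_times(temp)
        print(f"{temp:5.1f} C: median failure time ~ {np.median(ft):10.1f} (illustrative time units)")
```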
Analysis of total arsenic content in purchased rice from Ecuador
Natural and anthropogenic sources contribute to arsenic contamination of water and the human food chain in Andean countries. Human exposure to arsenic via rice consumption is of great concern in countries where this crop is the dominant staple food, and limited information is available on the arsenic contamination of rice in Ecuador. To help fill this gap, total arsenic was analysed by hydride generation-atomic absorption spectrometry in samples of white, brown and parboiled rice purchased in Ecuadorian markets and produced in the two main rice-growing wetlands of Ecuador, Guayas and Los Ríos. For the samples from Guayas, the arsenic concentrations in white, brown and parboiled rice were 0.174 ± 0.014, 0.232 ± 0.021, and 0.186 ± 0.017 mg/kg, respectively, whereas samples of white rice from Los Ríos showed a total arsenic level of 0.258 ± 0.037 mg/kg. This last concentration exceeds the maximum permissible limit recommended by the FAO/WHO. The data obtained were used to estimate the Ecuadorian dietary exposure, revealing a serious health risk for the population.
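The dietary exposure estimate alluded to above amounts to simple arithmetic: estimated daily intake = concentration × daily rice consumption / body weight. The sketch below uses the concentrations reported in the abstract, but the assumed consumption of 300 g/day and 70 kg body weight are hypothetical illustration values, not figures from the study.

```python
# Illustrative back-of-the-envelope dietary exposure calculation.
# Concentrations come from the abstract; consumption and body weight are hypothetical assumptions.
ARSENIC_MG_PER_KG = {
    "Guayas white": 0.174,
    "Guayas brown": 0.232,
    "Guayas parboiled": 0.186,
    "Los Rios white": 0.258,
}

def estimated_daily_intake(conc_mg_per_kg, rice_g_per_day=300.0, body_weight_kg=70.0):
    """Estimated daily intake in micrograms of arsenic per kg body weight per day."""
    intake_mg = conc_mg_per_kg * (rice_g_per_day / 1000.0)
    return intake_mg * 1000.0 / body_weight_kg  # ug/kg bw/day

for sample, conc in ARSENIC_MG_PER_KG.items():
    print(f"{sample:18s}: {estimated_daily_intake(conc):.2f} ug As / kg bw / day")
```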
Statistical intervals: a guide for practitioners and researchers
Describes statistical intervals to quantify sampling uncertainty, focusing on key application needs and recently developed methodology in an easy-to-apply format. Statistical intervals provide invaluable tools for quantifying sampling uncertainty. The widely hailed first edition, published in 1991, described the use and construction of the most important statistical intervals. Particular emphasis was given to intervals, such as prediction intervals, tolerance intervals and confidence intervals on distribution quantiles, that are frequently needed in practice but often neglected in introductory courses. Vastly improved computer capabilities over the past 25 years have resulted in an explosion of the tools readily available to analysts. This second edition, more than double the size of the first, adds these new methods in an easy-to-apply format. In addition to extensive updating of the original chapters, the second edition includes new chapters on:
  • Likelihood-based statistical intervals
  • Nonparametric bootstrap intervals
  • Parametric bootstrap and other simulation-based intervals
  • An introduction to Bayesian intervals
  • Bayesian intervals for the popular binomial, Poisson and normal distributions
  • Statistical intervals for Bayesian hierarchical models
  • Advanced case studies, further illustrating the use of the newly described methods
New technical appendices provide justification of the methods and pathways to extensions and further applications. A webpage directs readers to current, readily accessible computer software and other useful information. Statistical Intervals: A Guide for Practitioners and Researchers, Second Edition is an up-to-date working guide and reference for all who analyze data, allowing them to quantify the uncertainty in their results using statistical intervals.
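As one concrete example of the interval types the book covers, the sketch below computes a nonparametric bootstrap percentile interval for a sample median. The dataset is invented for illustration, and the book itself describes more refined constructions.

```python
# Minimal sketch of a nonparametric bootstrap percentile interval for the sample median.
import numpy as np

def bootstrap_percentile_ci(data, stat=np.median, n_boot=10_000, level=0.95, seed=0):
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    # Resample with replacement and recompute the statistic for each bootstrap sample.
    boot_stats = np.array([
        stat(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_boot)
    ])
    alpha = (1.0 - level) / 2.0
    return np.quantile(boot_stats, [alpha, 1.0 - alpha])

if __name__ == "__main__":
    sample = [12.1, 9.8, 14.3, 11.0, 10.5, 13.7, 9.2, 12.9, 11.8, 10.1]  # invented data
    lo, hi = bootstrap_percentile_ci(sample)
    print(f"95% bootstrap percentile interval for the median: ({lo:.2f}, {hi:.2f})")
```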
Identification of suitable copulas for bivariate frequency analysis of flood peak and flood volume data
Multivariate flood frequency analysis, involving flood peak flow, volume and duration, has traditionally been accomplished by employing available functional bivariate and multivariate frequency distributions that restrict the marginals to be from the same family of distributions. The copula concept overcomes this restriction by allowing a combination of arbitrarily chosen marginal types. It also provides a wider choice of admissible dependence structures compared to the conventional approach. The availability of a vast variety of copula types makes the selection of an appropriate copula family for different hydrological applications a non-trivial task. Graphical and analytic goodness-of-fit tests for assessing the suitability of copulas are still being developed, and there is limited experience of their use at present, especially in the hydrological field. This paper provides a step-wise procedure for copula selection and illustrates its application to bivariate flood frequency analysis involving flood peak flow and volume data. Several graphical procedures, tail dependence characteristics, and formal goodness-of-fit tests involving a parametric bootstrap-based technique are considered while investigating the relative applicability of six copula families. The Clayton copula has been identified as a valid model for the particular flood peak flow and volume data set considered in the study.
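One simple step in such a copula-selection workflow can be sketched as follows: draw pseudo-observations from a Clayton copula and recover its parameter by inverting Kendall's tau (theta = 2*tau / (1 - tau)). This is only an illustrative moment-style fit on synthetic data; the paper's full procedure also uses graphical diagnostics, tail-dependence characteristics, and a parametric-bootstrap goodness-of-fit test.

```python
# Hedged sketch: sample from a Clayton copula and fit theta by inverting Kendall's tau.
# The synthetic (u, v) pairs stand in for pseudo-observations of flood peak and volume.
import numpy as np
from scipy.stats import kendalltau

def sample_clayton(theta, n, seed=0):
    """Draw n pairs (u, v) from a Clayton copula via the conditional inverse method."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def fit_clayton_tau(x, y):
    """Estimate Clayton theta by inverting Kendall's tau: theta = 2*tau / (1 - tau)."""
    tau, _ = kendalltau(x, y)
    return 2.0 * tau / (1.0 - tau)

if __name__ == "__main__":
    u, v = sample_clayton(theta=2.0, n=2000, seed=1)
    print("estimated theta:", round(fit_clayton_tau(u, v), 2))  # should be near 2.0
```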
Estimating a Parametric Component Lifetime Distribution from a Collection of Superimposed Renewal Processes
Maintenance data can be used to make inferences about the lifetime distribution of system components. Typically, a fleet contains multiple systems. Within each system, there is a set of nominally identical replaceable components of particular interest (e.g., 2 automobile headlights, 8 dual in-line memory module (DIMM) modules in a computing server, 16 cylinders in a locomotive engine). For each component replacement event, there is system-level information that a component was replaced, but no information on which particular component was replaced. Thus, the observed data are a collection of superpositions of renewal processes (SRP), one for each system in the fleet. This article proposes a procedure for estimating the component lifetime distribution using the aggregated event data from a fleet of systems. We show how to compute the likelihood function for the collection of SRPs and provide suggestions for efficient computations. We compare performance of this incomplete-data maximum likelihood (ML) estimator with the complete-data ML estimator and study the performance of confidence interval methods for estimating quantiles of the lifetime distribution of the component. Supplementary materials for this article are available online.