150,057 result(s) for "RISK OF INFECTION"
Characterizing the HIV/AIDS epidemic in the Middle East and North Africa: time for strategic action
Despite a fair amount of progress on understanding human immunodeficiency virus (HIV) epidemiology globally, the Middle East and North Africa (MENA) region is the only region where knowledge of the epidemic continues to be very limited, and subject to much controversy. It has been more than 25 years since the discovery of HIV, but no scientific study has provided a comprehensive data-driven synthesis of HIV/AIDS (acquired immunodeficiency syndrome) infectious spread in this region. The current report provides the first comprehensive scientific assessment and data-driven epidemiological synthesis of HIV spread in MENA since the beginning of the epidemic. It is based on a literature review and analysis of thousands of largely unrecognized publications, reports, and data sources extracted from scientific literature or collected from sources at the local, national, and regional levels. The recommendations provided here focus on key strategies related to the scope of this report and its emphasis on understanding HIV epidemiology in MENA as a whole. The recommendations are based on identifying the status of the HIV epidemic in MENA, through this synthesis, as a low HIV prevalence setting with rising concentrated epidemics among priority populations. General directions for prevention interventions as warranted by the outcome of this synthesis are also discussed briefly, but are not delineated because they are beyond the scope of this report. This report was not intended to provide intervention recommendations for each MENA country.
Annual risk of hepatitis E virus infection and seroreversion: Insights from a serological cohort in Sitakunda, Bangladesh
Hepatitis E virus (HEV) is a major cause of acute jaundice in South Asia. Gaps in our understanding of transmission are driven by non-specific symptoms and scarcity of diagnostics, impeding rational control strategies. In this context, serological data can provide important proxy measures of infection. We enrolled a population-representative serological cohort of 2,337 individuals in Sitakunda, Bangladesh. We estimated the annual risks of HEV infection and seroreversion both using serostatus changes between paired serum samples collected 9 months apart, and by fitting catalytic models to the age-stratified cross-sectional seroprevalence. At baseline, 15% (95% CI: 14–17%) of people were seropositive, with seroprevalence highest in the relatively urban south. During the study, 27 individuals seroreverted (annual seroreversion risk: 15%, 95% CI: 10–21%), and 38 seroconverted (annual infection risk: 3%, 95% CI: 2–5%). Relying on cross-sectional seroprevalence data alone, and ignoring seroreversion, underestimated the annual infection risk five-fold (0.6%, 95% CrI: 0.5–0.6%). When we accounted for the observed seroreversion in a reversible catalytic model, infection risk was more consistent with measured seroincidence. Our results quantify HEV infection risk in Sitakunda and highlight the importance of accounting for seroreversion when estimating infection incidence from cross-sectional seroprevalence data.
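The reversible catalytic model described in this abstract has a simple closed form, which makes the five-fold underestimate easy to reproduce. A minimal sketch, with illustrative parameters chosen only to echo the reported figures (roughly 3% annual infection risk, 15% annual seroreversion risk), not the study's fitted code:

```python
import math

def seroprevalence(age, foi, serorev):
    # Reversible catalytic model: dP/da = foi*(1 - P) - serorev*P, P(0) = 0
    # => P(a) = foi/(foi + serorev) * (1 - exp(-(foi + serorev)*a))
    k = foi + serorev
    return (foi / k) * (1.0 - math.exp(-k * age))

# Illustrative values echoing the abstract (not the study's estimates):
p30 = seroprevalence(30, foi=0.03, serorev=0.15)  # ~0.17, near the 15% baseline

# Fitting an irreversible catalytic model P(a) = 1 - exp(-foi*a) to the same
# cross-sectional seroprevalence recovers a much smaller force of infection:
foi_naive = -math.log(1.0 - p30) / 30  # ~0.006, roughly a five-fold underestimate
```

This reproduces the abstract's qualitative point: when seroreversion is ignored, the infection risk inferred from cross-sectional seroprevalence alone is biased low by about a factor of five.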
Sample size efficiency of restricting participation in tuberculosis vaccine trials to interferon-gamma release assay-positive participants
A common approach to reducing sample sizes for late-stage tuberculosis vaccine trials is to restrict enrolment to interferon-gamma release assay (IGRA)-positive participants to maximize tuberculosis case accrual. The efficiency gain, if any, from this screening strategy is unknown. We estimated the age-specific IGRA positivity prevalence for transmission levels generally considered in tuberculosis vaccine trials (annual risk of tuberculosis infection [ARTI] 2–6%) and calculated the expected tuberculosis incidence at each age by IGRA status, using a difference equation model. We modelled scenarios that assumed constant or increasing ARTI during adolescence and differing levels of partial protection afforded by previous Mycobacterium tuberculosis infection. We then estimated sample size requirements for tuberculosis vaccine trials enrolling only IGRA-positive participants or participants without prior IGRA testing (a ‘Mixed’ trial). We assumed participants were 15–44 years old at enrolment and followed up for 3 years. Estimated tuberculosis incidence was 4.7 times higher in IGRA-positive compared to IGRA-negative participants at age 15 years, but 0.9 times lower at age 44 years (assuming ARTI 4%). This age-cohort effect was exacerbated when assuming partial protection and attenuated when assuming increasing ARTI during adolescence. In a model that included both these assumptions, the sample size required for a Mixed trial compared to that for an IGRA-positive-only trial was 124% larger at 2% ARTI, 36% larger at 4% ARTI, but only 8% larger at 6% ARTI. Prioritizing enrolment of participants aged 15–29 years improved sample size efficiency for an IGRA-positive-only trial. These results were largely unaffected by our model assumptions. In late-stage tuberculosis vaccine trials among adults and adolescents, pre-enrolment screening by IGRA testing yields a large gain in sample size efficiency when M. tuberculosis transmission levels are relatively low, but modest or no benefit at high transmission levels.
• Late-stage trials of new tuberculosis vaccines in adults/adolescents require large sample sizes.
• We modelled the sample size efficiency of trials with and without infection screening by IGRA.
• In high-incidence sites there is little efficiency gain in only enrolling infected participants.
• Enrolling relatively young participants reduces the required sample size.
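Under a constant ARTI, the fraction of a cohort that has ever been infected by a given age (a rough proxy for IGRA positivity, ignoring reversion and test error) has a simple closed form. A hedged sketch of that relationship only; it is not the authors' difference-equation model:

```python
def igra_positive_prev(age, arti):
    # Probability of ever having been infected by `age`, assuming a constant
    # annual risk of TB infection (ARTI). Ignores seroreversion/test error.
    return 1.0 - (1.0 - arti) ** age

# At a constant ARTI of 4%, roughly 46% of 15-year-olds would have been
# infected at least once -- consistent with IGRA screening excluding about
# half of potential adolescent enrollees at that transmission level.
prev_15 = igra_positive_prev(15, 0.04)
```

At higher ARTI the pool of IGRA-negatives shrinks and their residual infection risk rises, which is one intuition for why the screening strategy buys less at high transmission.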
Estimation and evaluation of the risks of protozoa infections associated with the water from a treatment plant in southern Brazil using the Quantitative Microbiological Risk Assessment Methodology (QMRA)
In this study, the Quantitative Microbial Risk Assessment (QMRA) methodology was applied to estimate the annual risk of Giardia and Cryptosporidium infection associated with a water treatment plant in southern Brazil. The efficiency of the treatment plant in removing protozoa and the effectiveness of the Brazilian legislation on microbiological protection were evaluated, emphasizing the relevance of implementing the QMRA in this context. Two distinct approaches were employed to estimate the mechanical removal of protozoa: the definitions provided by the United States Environmental Protection Agency (USEPA), and the model proposed by Nieminski and Ongerth. Although the raw water collected had a higher concentration of Giardia cysts than Cryptosporidium oocysts, the estimated values for the annual risk of infection were significantly higher for Cryptosporidium than for Giardia. From a general perspective, the risk values of protozoa infection were either below or very near the limit set by the World Health Organization (WHO). In contrast, all the risk values of Cryptosporidium infection exceeded the threshold established by the USEPA. Ultimately, it was concluded that the implementation of the QMRA methodology should be considered by the Brazilian authorities, as the requirements and guidelines provided by the Brazilian legislation proved to be insufficient to guarantee the microbiological safety of drinking water. In this context, the QMRA application can effectively contribute to the prevention and investigation of outbreaks of waterborne disease.
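A typical QMRA chain of the kind this abstract describes compounds a per-exposure dose-response model into an annual risk over repeated exposures. A minimal sketch with purely illustrative parameter values (the dose and infectivity parameter below are assumptions for demonstration, not figures from the study):

```python
import math

def daily_infection_risk(dose, r):
    # Exponential dose-response model: P = 1 - exp(-r * dose), where r is a
    # pathogen-specific infectivity parameter and dose is (oo)cysts ingested.
    return 1.0 - math.exp(-r * dose)

def annual_infection_risk(p_daily, exposures_per_year=365):
    # Compound independent daily exposures into an annual infection risk.
    return 1.0 - (1.0 - p_daily) ** exposures_per_year

# Illustrative values only (not taken from the study):
p_day = daily_infection_risk(dose=1e-4, r=0.09)
p_year = annual_infection_risk(p_day)  # small daily risks still compound
```

The compounding step is why a seemingly negligible per-day risk can exceed an annual benchmark, which is the kind of comparison against WHO and USEPA limits the study performs.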
PROM at term: when might be the best time to induce labour? A retrospective analysis
Purpose PROM (prelabor rupture of membranes) after 37 weeks of gestation occurs in approximately 10% of pregnancies. When spontaneous onset of labour does not follow, induction is recommended to decrease the risk of infection for both mother and child. However, there is no clear consensus on whether induction within 24 h after PROM results in fewer complications compared to induction more than 24 h after PROM. Material and methods This retrospective observational study analysed the outcomes of 3174 women with PROM admitted to the delivery room of LMU Women's Hospital between 10/2015 and 09/2020. We evaluated whether the timing of labour induction was associated with maternal or newborn postpartum infection rates. Results Comparing women with spontaneous onset of labour to those who underwent induction, no significant differences were found in maternal CRP or leukocyte levels, fever, endometritis, or Group B streptococcus colonization. However, intrapartum antibiotic therapy was significantly more frequent in the induction group. When the induction group was subdivided based on the interval from PROM to induction, no significant differences were observed in maternal infection parameters, need for antibiotics, postpartum length of hospital stay, or endometritis. For newborn infections, a significant difference in CRP levels was found, with higher levels in the groups with “induction < 12 h” and “> 24 h”. Conclusion The presented data suggest that waiting for spontaneous contractions within the first 24 h after PROM is not associated with an increased risk of infection when no initial signs of infection are present. However, beyond 24 h, the risk of infection increased. These findings support current recommendations regarding the timing of induction after PROM.
Employment conditions as barriers to the adoption of COVID-19 mitigation measures: how the COVID-19 pandemic may be deepening health disparities among low-income earners and essential workers in the United States
Background The COVID-19 pandemic has disproportionately impacted economically disadvantaged populations in the United States (US). Precarious employment conditions may contribute to these disparities by impeding workers in such conditions from adopting COVID-19 mitigation measures to reduce infection risk. This study investigated the relationship between employment and economic conditions and the adoption of COVID-19 protective behaviors among US workers during the initial phase of the COVID-19 pandemic. Methods Participants were recruited through a social media advertisement campaign, and an online, self-administered survey was used to collect data from 2,845 working adults in April 2020. Hierarchical generalized linear models were used to assess differences in engagement with recommended protective behaviors based on employment and economic conditions, while controlling for knowledge and perceived threat of COVID-19, as would be predicted by the Health Belief Model (HBM). Results Essential workers had more precarious employment and economic conditions than non-essential workers: 67% had variable income; 30% did not have paid sick leave; 42% had lost income due to COVID-19; and 15% were food insecure. The adoption of protective behaviors was high in the sample: 77% of participants avoided leaving home, and 93% increased hand hygiene. Consistent with the HBM, COVID-19 knowledge scores and perceived threat were positively associated with engaging in all protective behaviors. However, after controlling for these, essential workers, who by the nature of their jobs cannot stay at home, were 60% and 70% less likely than non-essential workers to stay at home and to increase hand hygiene, respectively. Similarly, participants who could not afford to quarantine were 50% less likely to avoid leaving home (AOR: 0.5; 95% CI: 0.4, 0.6) than those who could, whereas there were no significant differences concerning hand hygiene.
Conclusions Our findings are consistent with accumulating evidence that the employment conditions of essential workers and other low-income earners are precarious, that they experienced disproportionately higher rates of income loss during the initial phase of the COVID-19 pandemic, and that they face significant barriers to adopting protective measures. Our findings underscore the need for policy responses that expand social protection and benefits to prevent the further deepening of existing health disparities in the US.
Diabetes, Glycemic Control, and Risk of Infection Morbidity and Mortality: A Cohort Study
Abstract Objective Diabetic patients have an elevated risk of infection, but the optimal level of glycemic control with the lowest infection risk remains unclear, especially among the elderly. We aimed to investigate the relation between fasting plasma glucose (FPG) level and risk of infection-related morbidity and mortality. Method The participants were from a community-based health screening program in northern Taiwan during 2005–2008 (n = 118 645) and were followed up until 2014. Incidence of hospitalization for infection and infection-related death was ascertained from the National Health Insurance Database and National Death Registry. Cox proportional hazards regression modelling was used to estimate the hazard ratio (HR) between FPG and risk of infection. Results During a median follow-up of 8.1 years, the incidence rate of hospitalization for any infection was 36.33 and 14.26 per 1000 person-years among diabetics and nondiabetics, respectively, in the total study population, but increased to 70.02 and 45.21 per 1000 person-years, respectively, in the elderly. In the Cox regression analysis, the adjusted HR comparing diabetics to nondiabetics was 1.59 (95% confidence interval [CI], 1.52–1.67) for any hospitalization for infection and 1.71 (95% CI, 1.36–2.16) for infection-related mortality. The hazard for infection morbidity and mortality was higher at both extremes (<90 and >200 mg/dl) of FPG. The excess risk associated with FPG ≤ 90 mg/dl was attenuated after controlling for multiple comorbidities. Conclusions Poor glycemic control (FPG > 200 mg/dl) was associated with a higher risk of infection-related morbidity and mortality, especially in the elderly population where the baseline infection risk was high. Few studies have longitudinally assessed the association between glycemic control and risk of infection mortality. In this study, a U-shaped relation between fasting plasma glucose level and infection-related outcome was observed. 
After controlling for multiple comorbidities, the increased risk among those with a low fasting glucose level was attenuated, suggesting that comorbidities may play a role in the excess risk associated with low fasting plasma glucose levels, particularly among elderly diabetic patients.
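The crude rate ratios implied by the incidence rates reported in this abstract can be checked with simple arithmetic, using only the figures quoted above; comparing them with the adjusted HR of 1.59 illustrates how much of the crude contrast covariate adjustment absorbs:

```python
def rate_ratio(rate_exposed, rate_unexposed):
    # Crude incidence rate ratio; the "per 1000 person-years" unit cancels.
    return rate_exposed / rate_unexposed

# Figures quoted in the abstract (hospitalization for any infection):
rr_all = rate_ratio(36.33, 14.26)      # crude ratio, total population (~2.5)
rr_elderly = rate_ratio(70.02, 45.21)  # crude ratio, elderly subgroup (~1.5)
```

The crude overall ratio (~2.5) exceeds the adjusted HR (1.59), while the elderly subgroup's crude ratio is already close to it, reflecting the higher baseline infection risk at older ages.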
Respiratory syncytial virus in pediatric influenza‐like illness cases in Lombardy, Northern Italy, during seven consecutive winter seasons (from 2014–2015 to 2020–2021)
Introduction Respiratory syncytial virus (RSV) is the major cause of lower respiratory tract illness in young children and can also cause influenza‐like illness (ILI). Here we investigated the epidemiological features of RSV infection in pediatric ILI cases in Lombardy (a region in Northern Italy with nearly 10 million inhabitants) across the 2014–2015 to 2020–2021 winter seasons. Material and Methods Data for this study were retrieved and statistically analyzed from the database of virological influenza surveillance of the regional reference laboratory for Lombardy within the Italian influenza surveillance network (InfluNet). Results RSV accounted for nearly 19% of pediatric ILI, with a risk of infection nearly two‐fold greater than that of individuals ≥15 years. The RSV positivity rate increased to 28% when considering children 0–5 years old. Although in children ≤5 years the risk of infection from influenza viruses was nearly two‐fold higher than the risk of RSV infection, the 4–6-month and 7–12-month age groups showed a five‐fold greater risk of infection from RSV than from influenza. Children ≤5 years of age with pre‐existing underlying health conditions had a nearly five‐fold greater risk of RSV infection than otherwise healthy 0–5-year-old children. RSV was identified in ILI cases <15 years of age in all considered winter seasons except 2020–2021. Discussion Sentinel surveillance of ILI allowed us to identify groups at higher risk of RSV and influenza infection and to define the start, duration, timing, and intensity of RSV and influenza community circulation. This surveillance approach can be implemented to assess RSV circulation and impact in real time.
Airborne SARS-CoV-2 virus exposure, interpersonal distance, face mask and perceived risk of infection
Participants judged the risk of an infection during a face-to-face conversation at different interpersonal distances from a SARS-CoV-2-infected person who wore a face mask or not, and in the same questionnaire answered questions about coronavirus-related issues. Keeping a distance from an infected person serves as a protective measure against infection. When an infected person moves closer, the risk of infection increases. Participants were aware of this fact, but underestimated the rate at which the risk of infection increases when getting closer to an infected person, e.g., from 1.5 to 0.5 m (perceived risk increase = 3.33 times higher; objective = 9.00 times higher). This is alarming because it means that people can take on risks of infection that they are not aware of or do not want to take when they approach another, possibly virus-infected, person. Correspondingly, when an infected person moves away, the speed of the risk decrease was underestimated, meaning that people are not aware of how much safer they will be if they move away from an infected person. The perceived risk-reducing effects of a face mask were approximately correct. Judgments of infection risk at different interpersonal distances (with or without a mask) were unrelated to how often a person used a mask, avoided others, or canceled meetings during the COVID-19 pandemic. Greater worry in general, and over COVID-19 in particular, correlated positively with more protective behavior during the pandemic, but not with judgments of infection risk at different interpersonal distances. Participants with higher scores on a cognitive numeracy test judged mask efficiency more correctly, and women were more worried and risk-avoiding than men. The results have implications for understanding behavior in a pandemic and are relevant for risk communication about the steep increase in risk when approaching a person who may be infected with an airborne virus.
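The objective 9.00× figure quoted for moving from 1.5 m to 0.5 m matches exposure scaling with the inverse square of distance ((1.5/0.5)² = 9). A minimal sketch under that assumption, contrasting it with the perceived 3.33× increase reported above:

```python
def relative_risk_change(d_from, d_to):
    # Assumes infection risk scales with the inverse square of interpersonal
    # distance -- an assumption consistent with the 9.00x figure quoted above.
    return (d_from / d_to) ** 2

objective = relative_risk_change(1.5, 0.5)  # 9.0x increase in risk
perceived = 3.33                            # participants' mean judgment
underestimation = objective / perceived     # risk growth judged ~2.7x too low
```

The gap between the two numbers is the paper's central point: people judge risk to grow roughly linearly with proximity, while the assumed physical model grows quadratically.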
Acute effects of concurrent muscle power and sport-specific endurance exercises on markers of immunological stress response and measures of muscular fitness in highly trained youth male athletes
Purpose To examine the acute effects of concurrent muscle power and sport-specific endurance exercise order on immunological stress responses, muscular fitness, and rating of perceived exertion (RPE) in highly trained youth male judo athletes. Methods Twenty male participants randomly performed two concurrent training (CT) sessions: power-endurance and endurance-power. Measures of immune response (e.g., white blood cells), muscular fitness (i.e., countermovement jump [CMJ]), RPE, blood lactate, and blood glucose were taken at different time-points (i.e., pre, mid, post, and post-6 h). Results There were significant time × order interactions for white blood cells, lymphocytes, granulocytes, the granulocyte–lymphocyte ratio, and the systemic inflammation index. Power-endurance resulted in significantly larger pre-to-post increases in white blood cells and lymphocytes, while endurance-power resulted in significantly larger pre-to-post increases in the granulocyte–lymphocyte ratio and the systemic inflammation index. Likewise, significantly larger pre-to-post-6 h increases in white blood cells and granulocytes were observed following power-endurance compared to endurance-power. Moreover, there was a significant time × order interaction for blood glucose and lactate. Following endurance-power, blood lactate and glucose increased from pre to mid but not from pre to post; in power-endurance, they increased from pre to post but not from pre to mid. A significant time × order interaction was observed for CMJ force, with larger pre-to-post decreases in endurance-power compared to power-endurance. Further, CMJ power showed larger pre-to-mid performance decreases following power-endurance compared to endurance-power. Regarding RPE, significant time × order interactions were noted, with larger pre-to-mid values following endurance-power and larger pre-to-post values following power-endurance. Conclusion CT induced acute and delayed order-dependent immune cell count alterations in highly trained youth male judo athletes. In general, power-endurance induced higher acute and delayed immunological stress responses compared to endurance-power. CMJ force and RPE fluctuated during both CT sessions but returned to baseline 6 h post-exercise.