399 result(s) for "Bonten, Marc J. M."
Selective decontamination of the digestive tract (SDD) in critically ill patients: a narrative review
Selective decontamination of the digestive tract (SDD) is an infection prevention measure for intensive care unit (ICU) patients that was proposed more than 30 years ago, and that is currently considered standard of care in the Netherlands, but only used sporadically in ICUs in other countries. In this narrative review, we first describe the rationale of the individual components of SDD and then review the evidence base for patient-centered outcomes, where we distinguish ICUs with low prevalence of antibiotic resistance from ICUs with moderate-to-high prevalence of resistance. In settings with low prevalence of antibiotic resistance, SDD has been associated with improved patient outcome in three cluster-randomized studies. These benefits were not confirmed in a large international cluster-randomized study in settings with moderate-to-high prevalence of antibiotic resistance. There is no evidence that SDD increases antibiotic resistance. We end with future directions for research.
Dissemination of Cephalosporin Resistance Genes between Escherichia coli Strains from Farm Animals and Humans by Specific Plasmid Lineages
Third-generation cephalosporins are a class of β-lactam antibiotics that are often used for the treatment of human infections caused by Gram-negative bacteria, especially Escherichia coli. Worryingly, the incidence of human infections caused by third-generation cephalosporin-resistant E. coli is increasing worldwide. Recent studies have suggested that these E. coli strains, and their antibiotic resistance genes, can spread from food-producing animals, via the food-chain, to humans. However, these studies used traditional typing methods, which may not have provided sufficient resolution to reliably assess the relatedness of these strains. We therefore used whole-genome sequencing (WGS) to study the relatedness of cephalosporin-resistant E. coli from humans, chicken meat, poultry and pigs. One strain collection included pairs of human and poultry-associated strains that had previously been considered to be identical based on Multi-Locus Sequence Typing, plasmid typing and antibiotic resistance gene sequencing. The second collection included isolates from farmers and their pigs. WGS analysis revealed considerable heterogeneity between human and poultry-associated isolates. The most closely related pairs of strains from both sources differed by 1263 Single-Nucleotide Polymorphisms (SNPs) per Mbp core genome. In contrast, epidemiologically linked strains from humans and pigs differed by only 1.8 SNPs per Mbp core genome. WGS-based plasmid reconstructions revealed three distinct plasmid lineages (IncI1- and IncK-type) that carried cephalosporin resistance genes of the Extended-Spectrum Beta-Lactamase (ESBL)- and AmpC-types. The plasmid backbones within each lineage were virtually identical and were shared by genetically unrelated human and animal isolates. Plasmid reconstructions from short-read sequencing data were validated by long-read DNA sequencing for two strains. Our findings failed to demonstrate evidence for recent clonal transmission of cephalosporin-resistant E. coli strains from poultry to humans, as has been suggested based on traditional, low-resolution typing methods. Instead, our data suggest that cephalosporin resistance genes are mainly disseminated in animals and humans via distinct plasmids.
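The strain-relatedness comparisons above hinge on SNP distances normalized to the size of the core genome (1263 versus 1.8 SNPs per Mbp). A minimal sketch of that normalization on toy aligned sequences, not real WGS data; the sequences and lengths here are invented for illustration, and real pipelines derive SNPs from variant calling on assembled genomes:

```python
def snps_per_mbp(seq_a, seq_b, core_genome_bp=None):
    """SNP distance between two aligned sequences, normalized per Mbp
    of core genome -- the unit used to compare strain relatedness."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    snps = sum(1 for a, b in zip(seq_a, seq_b)
               if a != b and a != "-" and b != "-")  # ignore alignment gaps
    core = core_genome_bp if core_genome_bp is not None else len(seq_a)
    return snps / (core / 1_000_000)

# Two toy 1,000 bp "core genomes" differing at 2 positions -> 2,000 SNPs/Mbp
a = "ACGT" * 250
b = list(a)
b[10], b[20] = "T", "C"  # introduce two substitutions
b = "".join(b)
distance = snps_per_mbp(a, b)
```

On this scale, the 1263 versus 1.8 SNPs/Mbp contrast in the abstract is roughly three orders of magnitude, which is why the authors reject recent clonal transmission between poultry and humans.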
Direct Matrix-Assisted Laser Desorption Ionization Time-of-Flight Mass Spectrometry Improves Appropriateness of Antibiotic Treatment of Bacteremia
Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) allows the identification of microorganisms directly from positive blood culture broths. Use of MALDI-TOF MS for rapid identification of microorganisms from blood culture broths can reduce the turnaround time to identification and may lead to earlier appropriate treatment of bacteremia. Between February and April 2010, direct MALDI-TOF MS was routinely performed on all positive blood cultures. Between December 2009 and March 2010, no direct MALDI-TOF MS was used. Information on antibiotic therapy was collected from the hospital and intensive care units' information systems for all positive blood cultures during the study period. In total, 253 episodes of bacteremia were included, of which 89 during the intervention period and 164 during the control period. Direct performance of MALDI-TOF MS on positive blood culture broths reduced the time to species identification by 28.8 h and was associated with an 11.3% increase in the proportion of patients receiving appropriate antibiotic treatment 24 hours after blood culture positivity (64.0% in the control period versus 75.3% in the intervention period; p = 0.01). Routine implementation of this technique increased the proportion of patients on adequate antimicrobial treatment within 24 hours.
Model-based evaluation of school- and non-school-related measures to control the COVID-19 pandemic
The role of school-based contacts in the epidemiology of SARS-CoV-2 is incompletely understood. We use an age-structured transmission model fitted to age-specific seroprevalence and hospital admission data to assess the effects of school-based measures at different time points during the COVID-19 pandemic in the Netherlands. Our analyses suggest that the impact of measures reducing school-based contacts depends on the remaining opportunities to reduce non-school-based contacts. If opportunities to reduce the effective reproduction number (Re) with non-school-based measures are exhausted or undesired and Re is still close to 1, the additional benefit of school-based measures may be considerable, particularly among older school children. As two examples, we demonstrate that keeping schools closed after the summer holidays in 2020, in the absence of other measures, would not have prevented the second pandemic wave in autumn 2020, but closing schools in November 2020 could have reduced Re below 1, with unchanged non-school-based contacts.
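In an age-structured model, the effective reproduction number is the dominant eigenvalue of the next-generation matrix, whose entry (i, j) is the expected number of infections in group i caused by one case in group j. A toy two-group sketch of that calculation; the contact values are invented, not the authors' fitted Dutch model, and "closing schools" is mimicked simply by scaling down child-to-child transmission:

```python
def dominant_eigenvalue(K, iters=200):
    """Dominant eigenvalue of a non-negative matrix via power iteration;
    for a next-generation matrix this is the effective reproduction number Re."""
    v = [1.0] * len(K)
    lam = 1.0
    for _ in range(iters):
        w = [sum(K[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
        lam = max(w)                 # normalize by the largest component
        v = [x / lam for x in w]
    return lam

# Hypothetical 2-group (children, adults) next-generation matrix:
# K[i][j] = expected infections in group i caused by one case in group j.
K_open = [[0.9, 0.3],
          [0.4, 0.6]]
# "Schools closed": child-to-child transmission scaled down by 60%.
K_closed = [[0.9 * 0.4, 0.3],
            [0.4, 0.6]]

re_open = dominant_eigenvalue(K_open)      # just above 1: epidemic grows
re_closed = dominant_eigenvalue(K_closed)  # below 1: epidemic shrinks
```

The sketch shows the qualitative point of the abstract: when Re sits close to 1, removing one block of contacts can be enough to push the dominant eigenvalue below the epidemic threshold.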
Isoniazid Prophylactic Therapy for the Prevention of Tuberculosis in HIV Infected Adults: A Systematic Review and Meta-Analysis of Randomized Trials
Infection with Human Immunodeficiency Virus (HIV) is an important risk factor for tuberculosis (TB). Anti-retroviral therapy (ART) has improved the prognosis of HIV and reduced the risk of TB in infected patients. Isoniazid Preventive Therapy (IPT) aims to reduce the development of active TB in patients with latent TB. Our objective was to systematically review and synthesize effect estimates of IPT for TB prevention in adult HIV patients; secondary objectives were to assess the effect of IPT on HIV disease progression, all-cause mortality and adverse drug reactions (ADRs). Electronic databases were searched to identify relevant articles in English available by September 11th, 2015: research articles comparing IPT to placebo or no treatment in HIV-infected adults using randomized clinical trials. A qualitative review included study-level information on randomization and treatment allocation. Effect estimates were pooled using random-effects models to account for between-study heterogeneity. This review assessed ten randomized clinical trials that assigned 7619 HIV patients to IPT or placebo. An overall TB risk reduction of 35% (RR = 0.65; 95% CI (0.51, 0.84)) was found in all participants; however, a larger benefit of IPT was observed in Tuberculin Skin Test (TST)-positive participants, with a pooled relative risk reduction of 52% (RR = 0.48; 95% CI (0.29, 0.82)) and a prediction interval ranging from 0.13 to 1.81. There was no statistically significant effect of IPT on TB occurrence in TST-negative or TST-unknown participants. IPT also reduced the risk of HIV disease progression in all participants (RR = 0.69; 95% CI (0.48, 0.99)), despite no benefits observed within TST strata. All-cause mortality was not affected by IPT, although participants who had 12 months of IPT tended to have a reduced risk (RR = 0.65; 95% CI (0.47, 0.90)). IPT carried an elevated, yet statistically non-significant, risk of adverse drug reactions (RR = 1.20; 95% CI (1.20, 1.71)). Only a single study assessed the effect of IPT in combination with ART in preventing TB and the occurrence of multi-drug-resistant tuberculosis. IPT use substantially contributes to preventing TB in persons with HIV in general and in TST-positive individuals in particular. More evidence is needed to explain discrepancies in the protective effect of IPT in these individuals.
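The pooled relative risks above come from random-effects models. A compact sketch of the standard DerSimonian-Laird approach; the trial counts below are invented for illustration, not the data from this review:

```python
import math

def pooled_rr_dl(studies):
    """DerSimonian-Laird random-effects pooling of relative risks.
    studies: list of (events_treated, n_treated, events_control, n_control).
    Returns (pooled RR, (95% CI lower, 95% CI upper))."""
    y, v = [], []
    for a, n1, c, n2 in studies:
        y.append(math.log((a / n1) / (c / n2)))     # log relative risk
        v.append(1/a - 1/n1 + 1/c - 1/n2)           # variance of the log RR
    w = [1 / vi for vi in v]                        # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / C)         # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]            # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

# Invented trial counts, NOT the ten trials from this review:
trials = [(30, 500, 45, 500), (12, 300, 25, 310), (8, 250, 14, 240)]
rr, (lo, hi) = pooled_rr_dl(trials)
```

When the heterogeneity statistic Q falls below its degrees of freedom, tau-squared is truncated at zero and the estimate coincides with the fixed-effect result; with real heterogeneity, the random-effects weights pull study weights closer together and widen the interval.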
Risk factors for ESBL-producing Escherichia coli on pig farms: A longitudinal study in the context of reduced use of antimicrobials
The presence of extended-spectrum beta-lactamase-producing Escherichia coli (ESBL-E. coli) in food animals is a public health concern. This study aimed to determine the prevalence of ESBL-E. coli on pig farms and to assess the effect of reducing veterinary antimicrobial use (AMU) and of farm management practices on ESBL-E. coli occurrence on pig farms. During 2011-2013, 36 Dutch conventional pig farms participated in a longitudinal study (4 sampling times in 18 months). Rectal swabs were taken from 60 pigs per farm and pooled per 6 pigs within the same age category. Presence of ESBL-E. coli was determined by selective plating, and ESBL genes were characterized by microarray, PCR and gene sequencing. An extensive questionnaire on farm characteristics and AMU, expressed as Defined Daily Dosages per Animal Year (DDDA/Y), was available for the 6-month periods before each sampling moment. Associations between the presence of ESBL-E. coli-positive pigs and farm management practices were modelled with logistic regression. The number of farms with ESBL-E. coli-carrying pigs decreased from 16 to 10, and the prevalence of ESBL-E. coli-positive pooled pig samples halved from 27% to 13%. Overall, the most frequently detected ESBL genes were blaCTX-M-1, blaTEM-52 and blaCTX-M-14. The presence of ESBL-E. coli-carrying pigs was not related to total AMU, but was strongly determined by the presence or absence of cephalosporin use on the farm (OR = 46.4, p = 0.006). Other farm management factors, related to improved biosecurity, were also plausibly associated with a lower probability of farms being ESBL-E. coli-positive (e.g. presence of a hygiene lock, pest control delivered by a professional). In conclusion, ESBL-E. coli prevalence in pigs decreased between 2011 and 2013 in the Netherlands. On pig farms, the use of cephalosporins was associated with the presence of ESBL-E. coli-carrying pigs.
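The reported OR = 46.4 comes from logistic regression; for a single binary exposure, the univariable odds ratio reduces to a 2x2-table calculation. A sketch with invented farm counts, purely to show the arithmetic behind an odds ratio:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
                  outcome+   outcome-
    exposed          a          b
    unexposed        c          d
    """
    return (a / b) / (c / d)

# Invented counts: cephalosporin use (exposure) vs ESBL-E. coli-positive status.
# 8 of 10 using farms positive, 6 of 26 non-using farms positive.
or_ceph = odds_ratio(8, 2, 6, 20)  # (8/2) / (6/20) = 13.3...
```

With small cell counts like these, the estimate is unstable (which is also why ORs such as 46.4 typically come with wide confidence intervals); the study's multivariable model additionally adjusts for other farm factors.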
Characteristics and determinants of outcome of hospital-acquired bloodstream infections in intensive care units: the EUROBACT International Cohort Study
Purpose The recent increase in drug-resistant micro-organisms complicates the management of hospital-acquired bloodstream infections (HA-BSIs). We investigated the epidemiology of HA-BSI and evaluated the impact of drug resistance on outcomes of critically ill patients, controlling for patient characteristics and infection management. Methods A prospective, multicentre non-representative cohort study was conducted in 162 intensive care units (ICUs) in 24 countries. Results We included 1,156 patients [mean ± standard deviation (SD) age, 59.5 ± 17.7 years; 65% males; mean ± SD Simplified Acute Physiology Score (SAPS) II, 50 ± 17] with HA-BSIs, of which 76% were ICU-acquired. Median time to diagnosis was 14 [interquartile range (IQR), 7–26] days after hospital admission. Polymicrobial infections accounted for 12% of cases. Among monomicrobial infections, 58.3% were gram-negative, 32.8% gram-positive, 7.8% fungal and 1.2% due to strict anaerobes. Overall, 629 (47.8%) isolates were multidrug-resistant (MDR), including 270 (20.5%) extensively drug-resistant (XDR) and 5 (0.4%) pan-drug-resistant (PDR). Micro-organism distribution and MDR occurrence varied significantly (p < 0.001) by country. The 28-day all-cause fatality rate was 36%. In the multivariable model including micro-organism, patient and centre variables, independent predictors of 28-day mortality included an MDR isolate [odds ratio (OR), 1.49; 95% confidence interval (CI), 1.07–2.06], uncontrolled infection source (OR, 5.86; 95% CI, 2.5–13.9) and time to adequate treatment (before day 6 after blood culture collection versus never, OR, 0.38; 95% CI, 0.23–0.63; from day 6 onwards versus never, OR, 0.20; 95% CI, 0.08–0.47). Conclusions MDR and XDR bacteria (especially gram-negative) are common in HA-BSIs in critically ill patients and are associated with increased 28-day mortality. Intensified efforts to prevent HA-BSIs and to optimize their management through adequate source control and antibiotic therapy are needed to improve outcomes.
The impact of community-acquired pneumonia on the health-related quality-of-life in elderly
Background The sustained health-related quality-of-life of patients surviving community-acquired pneumonia has not been accurately quantified. The aim of the current study was to quantify differences in health-related quality-of-life of community-dwelling elderly with and without community-acquired pneumonia during a 12-month follow-up period. Methods In a matched cohort study design, nested in a prospective randomized double-blind placebo-controlled trial on the efficacy of the 13-valent pneumococcal vaccine in community-dwelling persons of ≥65 years, health-related quality-of-life was assessed in 562 subjects hospitalized with suspected community-acquired pneumonia (i.e. diseased cohort) and 1145 unaffected persons (i.e. non-diseased cohort) matched to pneumonia cases on age, sex, and health status (EQ-5D-3L index). Health-related quality-of-life was determined 1–2 weeks after hospital discharge/inclusion and 1, 6 and 12 months thereafter, using EuroQol EQ-5D-3L and Short Form-36 Health Survey questionnaires. One-year quality-adjusted life years (QALYs) were estimated for both diseased and non-diseased cohorts. Separate analyses were performed for pneumonia cases with and without radiologically confirmed community-acquired pneumonia. Results The one-year excess QALY loss attributed to community-acquired pneumonia was 0.13. Mortality in the post-discharge follow-up year was 8.4% in community-acquired pneumonia patients and 1.2% in non-diseased persons (p < 0.001). During follow-up, health-related quality-of-life was persistently lower in community-acquired pneumonia patients compared to non-diseased persons, but differences in health-related quality-of-life between radiologically confirmed and non-confirmed community-acquired pneumonia cases were not statistically significant. Conclusions Community-acquired pneumonia was associated with a six-fold increased mortality and 16% lower quality-of-life in the post-discharge year among patients surviving hospitalization for community-acquired pneumonia, compared to non-diseased persons. Trial registration ClinicalTrials.gov, NCT00812084.
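QALYs over a follow-up period are the area under the utility-over-time curve. A sketch of the trapezoidal-rule calculation at measurement times matching the study's schedule; the EQ-5D utility values below are invented, so the resulting loss is illustrative rather than the study's 0.13 estimate:

```python
def qalys(times_years, utilities):
    """Quality-adjusted life-years as the area under the utility curve
    (trapezoidal rule). times_years and utilities are matched samples."""
    total = 0.0
    for i in range(1, len(times_years)):
        dt = times_years[i] - times_years[i - 1]
        total += dt * (utilities[i] + utilities[i - 1]) / 2
    return total

# Hypothetical EQ-5D utilities at 0, 1, 6 and 12 months (times in years):
times = [0.0, 1 / 12, 6 / 12, 1.0]
pneumonia = [0.55, 0.65, 0.72, 0.75]  # invented: recovery after discharge
controls = [0.82, 0.82, 0.83, 0.83]   # invented: stable matched cohort
excess_loss = qalys(times, controls) - qalys(times, pneumonia)
```

The excess QALY loss is simply the gap between the two areas; mortality during follow-up is usually handled by setting utility to zero from the date of death, which this sketch omits.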
Electronic Implementation of a Novel Surveillance Paradigm for Ventilator-associated Events. Feasibility and Validation
Accurate surveillance of ventilator-associated pneumonia (VAP) is hampered by subjective diagnostic criteria. A novel surveillance paradigm for ventilator-associated events (VAEs) was introduced. To determine the validity of surveillance using the new VAE algorithm. Prospective cohort study in two Dutch academic medical centers (2011-2012). VAE surveillance was electronically implemented and included assessment of (infection-related) ventilator-associated conditions (VAC, IVAC) and VAP. Concordance with ongoing prospective VAP surveillance was assessed, along with clinical diagnoses underlying VAEs and associated mortality of all conditions. Consequences of minor differences in electronic VAE implementation were evaluated. The study included 2,080 patients with 2,296 admissions. Incidences of VAC, IVAC, VAE-VAP, and VAP according to prospective surveillance were 10.0, 4.2, 3.2, and 8.0 per 1000 ventilation days, respectively. The VAE algorithm detected at most 32% of the patients with VAP identified by prospective surveillance. VAC signals were most often caused by volume overload and infections, but not necessarily VAP. Subdistribution hazards for mortality were 3.9 (95% confidence interval, 2.9-5.3) for VAC, 2.5 (1.5-4.1) for IVAC, 2.0 (1.1-3.6) for VAE-VAP, and 7.2 (5.1-10.3) for VAP identified by prospective surveillance. In sensitivity analyses, mortality estimates varied considerably after minor differences in electronic algorithm implementation. Concordance between the novel VAE algorithm and VAP was poor. Incidence and associated mortality of VAE were susceptible to small differences in electronic implementation. More studies are needed to characterize the clinical entities underlying VAE and to ensure comparability of rates from different institutions.
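The VAE paradigm flags a ventilator-associated condition (VAC) when, after at least 2 days of stable or decreasing daily minimum PEEP and FiO2, either value rises (by ≥3 cmH2O PEEP or ≥0.20 FiO2) and the rise is sustained for at least 2 days. A simplified sketch of that detection rule; the baseline-stability check is omitted for brevity, and the example ventilation series is invented:

```python
def vac_onset(peep_min, fio2_min):
    """Day index where a ventilator-associated condition (VAC) begins, else None.
    Simplified sketch of the VAE worsening rule: relative to a 2-day baseline,
    a rise of >=3 cmH2O in daily minimum PEEP or >=0.20 in daily minimum FiO2,
    sustained for >=2 days. (Baseline-stability check omitted for brevity.)"""
    n = len(peep_min)
    for day in range(2, n - 1):
        base_peep = min(peep_min[day - 2:day])
        base_fio2 = min(fio2_min[day - 2:day])
        peep_rise = all(p >= base_peep + 3 for p in peep_min[day:day + 2])
        fio2_rise = all(f >= base_fio2 + 0.20 for f in fio2_min[day:day + 2])
        if peep_rise or fio2_rise:
            return day
    return None

# Invented daily minimum PEEP (cmH2O) and FiO2 series for one patient:
onset = vac_onset([5, 5, 5, 9, 9], [0.40, 0.40, 0.40, 0.40, 0.40])  # day 3
```

Even this toy version shows why the study found electronic implementation fragile: shifting the baseline window or the sustainment requirement by a single day changes which patients are flagged.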
Classification of sepsis, severe sepsis and septic shock: the impact of minor variations in data capture and definition of SIRS criteria
Purpose To quantify the effects of minor variations in the definition and measurement of systemic inflammatory response syndrome (SIRS) criteria and organ failure on the observed incidences of sepsis, severe sepsis and septic shock. Methods We conducted a prospective, observational study in a tertiary intensive care unit in the Netherlands between January 2009 and October 2010. A total of 1,072 consecutive adults were included. We determined the upper and lower limits of the measured incidence of sepsis by evaluating the influence of the use of an automated versus a manual method of data collection, and variations in the number of SIRS criteria, concurrency of SIRS criteria, and duration of abnormal values required to make a particular diagnosis. Results The measured incidence of SIRS varied from 49% (most restrictive setting) to 99% (most liberal setting). Subsequently, the incidences of sepsis, severe sepsis and septic shock ranged from 22 to 31%, from 6 to 27% and from 4 to 9% for the most restrictive versus the most liberal measurement settings, respectively. In non-infected patients, 39–98% had SIRS, whereas 17–6% of patients without SIRS still had an infection. Conclusions The apparent incidence of sepsis heavily depends on minor variations in the definition of SIRS and the mode of data recording. As a consequence, the current consensus criteria do not ensure uniform recruitment of patients into sepsis trials.
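The sensitivity described above can be illustrated by counting SIRS criteria under an adjustable threshold: the same patient qualifies under the usual two-criteria rule but not under a stricter three-criteria variant. A sketch using the standard SIRS cut-offs (the PaCO2 and band-form alternatives are omitted for brevity, and the patient values are invented):

```python
def sirs_count(temp_c, heart_rate, resp_rate, wbc):
    """Count how many of four (simplified) SIRS criteria are met.
    Omits the PaCO2 and immature-band alternatives for brevity."""
    return sum([
        temp_c > 38.0 or temp_c < 36.0,  # temperature out of range
        heart_rate > 90,                 # tachycardia
        resp_rate > 20,                  # tachypnoea
        wbc > 12.0 or wbc < 4.0,         # leukocytes, x10^9/L
    ])

def has_sirs(vitals, required=2):
    """SIRS under an adjustable threshold -- the 'minor variation' at issue."""
    return sirs_count(*vitals) >= required

patient = (38.4, 95, 18, 11.0)          # invented: fever + tachycardia only
standard = has_sirs(patient)            # meets the usual >=2-criteria rule
strict = has_sirs(patient, required=3)  # excluded by a stricter >=3 rule
```

Layer on further choices (whether criteria must be concurrent, how long values must stay abnormal, automated versus manual capture) and the wide incidence ranges reported in the abstract follow naturally.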