285 result(s) for "Smith, R.M"
Antimicrobial Resistance as Risk Factor for Recurrent Bacteremia after Staphylococcus aureus, Escherichia coli, or Klebsiella spp. Community-Onset Bacteremia
We investigated links between antimicrobial resistance in community-onset bacteremia and 1-year bacteremia recurrence by using the clinical data warehouse of Europe's largest university hospital group in France. We included adult patients hospitalized with an incident community-onset Staphylococcus aureus, Escherichia coli, or Klebsiella spp. bacteremia during 2017-2019. We assessed risk factors for 1-year recurrence using Fine-Gray regression models. Of the 3,617 patients included, 291 (8.0%) had ≥1 recurrence episode. Third-generation cephalosporin (3GC) resistance was significantly associated with increased recurrence risk after incident Klebsiella spp. (hazard ratio 3.91 [95% CI 2.32-6.59]) or E. coli (hazard ratio 2.35 [95% CI 1.50-3.68]) bacteremia. Methicillin resistance in S. aureus bacteremia had no effect on recurrence risk. Although several underlying conditions and infection sources increased recurrence risk, 3GC-resistant Klebsiella spp. was associated with the greatest increase. These results demonstrate a new facet to illness induced by 3GC-resistant Klebsiella spp. and E. coli in the community setting.
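The recurrence analysis above is a competing-risks problem: within a year a patient can recur, die first, or stay event-free. As a rough illustrative sketch of that setup (simulated data and rates, not the study's; the authors fit Fine-Gray subdistribution hazard models, which this does not reproduce), the Python below estimates the 1-year cumulative incidence of recurrence with death as a competing event:

# Toy competing-risks illustration: recurrence (event 1) competes with
# death (event 2); follow-up is censored at 365 days. All data simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 3617                                   # cohort size from the abstract
t_recur = rng.exponential(3000, n)         # days to recurrence (simulated)
t_death = rng.exponential(2000, n)         # days to death (simulated)
time = np.minimum.reduce([t_recur, t_death, np.full(n, 365.0)])
event = np.where(time == 365.0, 0, np.where(t_recur < t_death, 1, 2))

# Aalen-Johansen estimate of the cumulative incidence of recurrence
order = np.argsort(time)
time, event = time[order], event[order]
at_risk = np.arange(n, 0, -1)              # individuals still at risk
surv = np.cumprod(1 - (event > 0) / at_risk)     # event-free survival
surv_lag = np.concatenate([[1.0], surv[:-1]])    # S(t-) just before each time
cif = np.cumsum(surv_lag * (event == 1) / at_risk)
print(f"1-year cumulative incidence of recurrence: {cif[-1]:.3f}")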
Associations between lower limb muscle activation strategies and resultant multi-planar knee kinetics during single leg landings
Anterior cruciate ligament injury prevention programs purportedly improve knee joint loading through beneficial modification of lower limb neuromuscular control strategies and joint biomechanics, but little is known about how these factors relate during single-legged landings. Thus, we examined the relationship between explicit lower limb muscular pre-activity patterns and knee joint biomechanics elicited during such landings. Randomized controlled trial. Thirty-five female athletes had 3D knee joint biomechanics and lower limb EMG data recorded during a series of single-leg landings. Regression analysis assessed the relationship of pre-activity of vastus lateralis, lateral hamstring and rectus femoris with peak knee flexion angle and moment, and external anterior tibial shear force. Vastus lateralis, lateral hamstring and vastus lateralis:lateral hamstring co-contraction pre-activity were used to assess relationships with knee abduction angle and moment. Greater pre-activity of rectus femoris predicted increased peak anterior tibial shear force (R2=0.235, b=2.41 and P=0.003) and reduced knee flexion moment (R2=0.131, b=−0.591 and P=0.032), while greater lateral hamstring pre-activity predicted decreased peak knee flexion angle (R2=0.113, b=8.96 and P=0.048). No EMG pre-activity parameters were significant predictors (P>0.05) of knee abduction angle or moment. Current outcomes suggest that reducing reliance on quadriceps activation may be beneficial during single-legged landings, although some quadriceps activation may be required for adequate joint stability during such maneuvers. Further research is needed to determine if inadequate hamstring activation, rather than elevated quadriceps activation, leads to hazardous loading during single-legged landings.
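As a minimal sketch of the regression step described above (simulated values; the predictor, outcome and effect size are purely illustrative, not the study's data), statistics of the reported R2/b/P form can be obtained from an ordinary least-squares fit:

# Illustrative OLS regression: does rectus femoris pre-activity predict
# peak anterior tibial shear force? All numbers below are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 35                                       # n female athletes, per the abstract
rf_preactivity = rng.uniform(0.1, 0.8, n)    # normalised EMG amplitude (simulated)
shear_force = 2.41 * rf_preactivity + rng.normal(0, 0.5, n)  # simulated outcome

model = sm.OLS(shear_force, sm.add_constant(rf_preactivity)).fit()
print(f"R^2 = {model.rsquared:.3f}, b = {model.params[1]:.2f}, "
      f"P = {model.pvalues[1]:.3f}")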
Platelet-specific SLFN14 deletion causes macrothrombocytopenia and platelet dysfunction through dysregulated megakaryocyte and platelet gene expression
Schlafen 14-related (SLFN14-related) thrombocytopenia is a rare bleeding disorder caused by SLFN14 mutations altering hemostasis in patients with platelet dysfunction. SLFN proteins are highly conserved in mammals, where SLFN14 is specifically expressed in the megakaryocyte (MK) and erythroblast lineages. The role of SLFN14 in megakaryopoiesis and platelet function has not been elucidated. Therefore, we generated a murine model with a platelet- and MK-specific SLFN14 deletion, using platelet factor 4 (PF4) Cre-mediated deletion of exons 2 and 3 in Slfn14 (Slfn14 PF4-Cre), to decipher the molecular mechanisms driving the bleeding phenotype. Slfn14 PF4-Cre+ platelets displayed reduced platelet signaling to thrombin, reduced thrombin formation, increased bleeding tendency, and delayed thrombus formation as assessed by intravital imaging. Moreover, fewer in situ bone marrow MKs were present compared with controls. RNA-Seq and Gene Ontology analysis of MKs and platelets from Slfn14 PF4-Cre homozygous mice revealed altered pathways of ubiquitination, adenosine triphosphate activity, and cytoskeleton and molecular function. In summary, SLFN14 deletion in MKs and platelets alters their transcriptome and causes platelet dysfunction, explaining the bleeding phenotype observed in humans and mice with SLFN14 mutations.
Efficacy and stability performance of traditional versus motion sensor-assisted strategies for FES standing
Standing by means of functional electrical stimulation (FES) after spinal cord injury is a topic widely reported in the neurorehabilitation literature. This practice commonly uses surface stimulation over the quadriceps muscle to evoke knee extension. To date, most FES neuroprostheses still operate without any artificial feedback, meaning that after a fatigue-driven knee buckle event, the stimulation amplitude or pulse width must be increased manually via button presses to re-establish knee-lock. This is often referred to as ‘hand-controlled (HC) operation’. In an attempt to provide a safer, yet clinically practical approach, this study proposed two novel strategies to automate the control of knee extension based on the kinematic feedback of four miniaturised motion sensors. These strategies were compared to the traditional HC strategy in four individuals with complete paraplegia. The standing times observed over multiple trials were in general longer for the automated strategies than for HC (by 0.5–80%). With the automated strategies, three of the subjects tended to need less upper body support over a frame to maintain balance. A stability analysis based on centre of pressure (CoP) measurements also favoured the automated strategies. This analysis also revealed that although FES standing with the assistance of a frame was likely to be safe for the subjects, their stability was still inferior to that of able-bodied individuals. Overall, the unpredictability of knee buckle events could be more effectively handled by automated FES strategies that re-establish knee-lock than by the traditional user-controlled approach, thus demonstrating the safety and clinical efficacy of an automated approach.
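A hypothetical sketch of the automated idea, not the paper's actual controllers: ramp the stimulation pulse width whenever sensor-measured knee flexion signals a buckle, instead of waiting for a manual button press. All thresholds and units below are assumed for illustration:

# Hypothetical FES knee-lock control loop driven by motion-sensor feedback.
# Threshold values, step sizes and units are illustrative assumptions.
KNEE_LOCK_DEG = 5.0      # knee flexion angle considered "locked"
BUCKLE_DEG = 15.0        # flexion angle that signals a buckle event
PW_STEP_US = 25          # pulse-width increment per control tick (microseconds)
PW_MAX_US = 500          # safety ceiling on pulse width

def control_tick(knee_flexion_deg: float, pulse_width_us: int) -> int:
    """One control-loop iteration: ramp stimulation up while the knee is
    buckling, hold it steady once knee-lock is re-established."""
    if knee_flexion_deg > BUCKLE_DEG:
        return min(pulse_width_us + PW_STEP_US, PW_MAX_US)
    return pulse_width_us    # locked or near-locked: no change needed

# Example: a buckle detected at 20 degrees triggers an automatic ramp.
print(control_tick(20.0, 300))   # -> 325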
Soil warming alters nitrogen cycling in a New England forest: implications for ecosystem function and structure
Global climate change is expected to affect terrestrial ecosystems in a variety of ways. Some of the more well-studied effects include the biogeochemical feedbacks to the climate system that can either increase or decrease the atmospheric load of greenhouse gases such as carbon dioxide and nitrous oxide. Less well-studied are the effects of climate change on the linkages between soil and plant processes. Here, we report the effects of soil warming on these linkages observed in a large field manipulation of a deciduous forest in southern New England, USA, where soil was continuously warmed 5°C above ambient for 7 years. Over this period, we have observed significant changes to the nitrogen cycle that have the potential to affect tree species composition in the long term. Since the start of the experiment, we have documented a 45% average annual increase in net nitrogen mineralization and a three-fold increase in nitrification such that in years 5 through 7, 25% of the nitrogen mineralized is then nitrified. The warming-induced increase of available nitrogen resulted in increases in the foliar nitrogen content and the relative growth rate of trees in the warmed area. Acer rubrum (red maple) trees have responded the most after 7 years of warming, with the greatest increases in both foliar nitrogen content and relative growth rates. Our study suggests that considering species-specific responses to increases in nitrogen availability and changes in nitrogen form is important in predicting future forest composition and feedbacks to the climate system.
“Sculpting George Foreman”: A Soul Era Champion in the Golden Age of Black Heavyweights
The most consistent aspect of George Foreman’s life has been his willingness to change. Yet with regard to Foreman’s early career, scholars have fixated only on snapshots of the flag-waving gold medalist at the 1968 Olympics, the surly heavyweight champion at the “Rumble in the Jungle” in 1974, or the gregarious pitchman for an eponymous kitchen appliance in the 1990s. These images, however, were not as important to the history of prize fighting as his process of transition in the early 1970s. Borrowing heavily from Soul Era popular culture to reinvent his public image allowed Foreman to insert himself into the sport’s greatest rivalry, between Joe Frazier and Muhammad Ali, and initiate a series of mega-matches in exotic locales that ultimately became the hallmark of this “Golden Age” for black heavyweight boxers.
Collateral impacts of pandemic COVID-19 drive the nosocomial spread of antibiotic resistance: A modelling study
Circulation of multidrug-resistant bacteria (MRB) in healthcare facilities is a major public health problem. These settings have been greatly impacted by the Coronavirus Disease 2019 (COVID-19) pandemic, notably due to surges in COVID-19 caseloads and the implementation of infection control measures. We sought to evaluate how these collateral impacts of COVID-19 affected the nosocomial spread of MRB in an early pandemic context. We developed a mathematical model in which Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and MRB cocirculate among patients and staff in a theoretical hospital population. Responses to COVID-19 were captured mechanistically via a range of parameters that reflect impacts of SARS-CoV-2 outbreaks on factors relevant for pathogen transmission. COVID-19 responses include both "policy responses" willingly enacted to limit SARS-CoV-2 transmission (e.g., universal masking, patient lockdown, and reinforced hand hygiene) and "caseload responses" unwillingly resulting from surges in COVID-19 caseloads (e.g., abandonment of antibiotic stewardship, disorganization of infection control programmes, and extended length of stay for COVID-19 patients). We conducted 2 main sets of model simulations, in which we quantified impacts of SARS-CoV-2 outbreaks on MRB colonization incidence and antibiotic resistance rates (the share of colonization due to antibiotic-resistant versus antibiotic-sensitive strains). The first set of simulations represents diverse MRB and nosocomial environments, accounting for high levels of heterogeneity across bacterial parameters (e.g., rates of transmission, antibiotic sensitivity, and colonization prevalence among newly admitted patients) and hospital parameters (e.g., rates of interindividual contact, antibiotic exposure, and patient admission/discharge). On average, COVID-19 control policies coincided with MRB prevention, including 28.2% [95% uncertainty interval: 2.5%, 60.2%] fewer incident cases of patient MRB colonization. Conversely, surges in COVID-19 caseloads favoured MRB transmission, resulting in a 13.8% [-3.5%, 77.0%] increase in colonization incidence and a 10.4% [0.2%, 46.9%] increase in antibiotic resistance rates in the absence of concomitant COVID-19 control policies. When COVID-19 policy responses and caseload responses were combined, MRB colonization incidence decreased by 24.2% [-7.8%, 59.3%], while resistance rates increased by 2.9% [-5.4%, 23.2%]. Impacts of COVID-19 responses varied across patients and staff and their respective routes of pathogen acquisition. The second set of simulations was tailored to specific hospital wards and nosocomial bacteria (methicillin-resistant Staphylococcus aureus, extended-spectrum beta-lactamase producing Escherichia coli). Consequences of nosocomial SARS-CoV-2 outbreaks were found to be highly context specific, with impacts depending on the specific ward and bacteria evaluated. In particular, SARS-CoV-2 outbreaks significantly impacted patient MRB colonization only in settings with high underlying risk of bacterial transmission. Yet across settings and species, antibiotic resistance burden was reduced in facilities with timelier implementation of effective COVID-19 control policies. Our model suggests that surges in nosocomial SARS-CoV-2 transmission generate selection for the spread of antibiotic-resistant bacteria. Timely implementation of efficient COVID-19 control measures thus has a 2-fold benefit, preventing the transmission of both SARS-CoV-2 and MRB and highlighting antibiotic resistance control as a collateral benefit of pandemic preparedness.
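A heavily stripped-down sketch of the modelling idea (illustrative parameters and a single SIS-type colonization state, nothing like the paper's full patient-staff network model): COVID-19 "policy responses" scale MRB transmission down, while "caseload responses" scale it up:

# Toy MRB colonisation dynamics under COVID-19 responses. All parameter
# values are assumed for illustration, not taken from the paper.
import numpy as np
from scipy.integrate import odeint

def mrb_model(y, t, beta, mu, policy_effect, caseload_effect):
    """SIS-type colonisation: policy responses reduce effective transmission,
    caseload responses (e.g. lost stewardship) increase it."""
    colonised = y[0]
    beta_eff = beta * (1 - policy_effect) * (1 + caseload_effect)
    return [beta_eff * colonised * (1 - colonised) - mu * colonised]

t = np.linspace(0, 90, 91)                    # days
base = odeint(mrb_model, [0.05], t, args=(0.10, 0.05, 0.0, 0.0))
pandemic = odeint(mrb_model, [0.05], t, args=(0.10, 0.05, 0.4, 0.3))
print(f"day-90 MRB prevalence: baseline {base[-1, 0]:.2f}, "
      f"with combined COVID-19 responses {pandemic[-1, 0]:.2f}")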
Optimizing COVID-19 surveillance in long-term care facilities: a modelling study
Background: Long-term care facilities (LTCFs) are vulnerable to outbreaks of coronavirus disease 2019 (COVID-19). Timely epidemiological surveillance is essential for outbreak response, but is complicated by a high proportion of silent (non-symptomatic) infections and limited testing resources. Methods: We used a stochastic, individual-based model to simulate transmission of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) along detailed inter-individual contact networks describing patient-staff interactions in a real LTCF setting. We simulated distribution of nasopharyngeal swabs and reverse transcriptase polymerase chain reaction (RT-PCR) tests using clinical and demographic indications and evaluated the efficacy and resource-efficiency of a range of surveillance strategies, including group testing (sample pooling) and testing cascades, which couple (i) testing for multiple indications (symptoms, admission) with (ii) random daily testing. Results: In the baseline scenario, randomly introducing a silent SARS-CoV-2 infection into a 170-bed LTCF led to large outbreaks, with a cumulative 86 (95% uncertainty interval 6–224) infections after 3 weeks of unmitigated transmission. Efficacy of symptom-based screening was limited by lags to symptom onset and silent asymptomatic and pre-symptomatic transmission. Across scenarios, testing upon admission detected just 34–66% of patients infected upon LTCF entry, and also missed potential introductions from staff. Random daily testing was more effective when targeting patients than staff, but was overall an inefficient use of limited resources. At high testing capacity (>10 tests/100 beds/day), cascades were most effective, with a 19–36% probability of detecting outbreaks prior to any nosocomial transmission, and 26–46% prior to first onset of COVID-19 symptoms. Conversely, at low capacity (<2 tests/100 beds/day), group testing strategies detected outbreaks earliest. Pooling randomly selected patients in a daily group test was most likely to detect outbreaks prior to first symptom onset (16–27%), while pooling patients and staff expressing any COVID-like symptoms was the most efficient means to improve surveillance given resource limitations, requiring only 6–9 additional tests and 11–28 additional swabs compared to the reference to detect outbreaks 1–6 days earlier, prior to an additional 11–22 infections. Conclusions: COVID-19 surveillance is challenged by delayed or absent clinical symptoms and imperfect diagnostic sensitivity of standard RT-PCR tests. In our analysis, group testing was the most effective and efficient COVID-19 surveillance strategy for resource-limited LTCFs. Testing cascades were even more effective given ample testing resources. Increasing testing capacity and updating surveillance protocols accordingly could facilitate earlier detection of emerging outbreaks, informing a need for urgent intervention in settings with ongoing nosocomial transmission.
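As an illustrative sketch of group testing, not the paper's model: the toy simulation below pools swabs from 10 random patients per day in a 170-bed facility seeded with one silent infection; the pooled-test sensitivity penalty for sample dilution is an assumed value:

# Toy group-testing (sample pooling) surveillance simulation.
# Sensitivity, dilution penalty and pool size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def pooled_test(statuses, sensitivity=0.9, dilution_penalty=0.05):
    """Return True if a single test on the pooled swabs comes back positive."""
    if not np.any(statuses):
        return False
    eff_sens = max(sensitivity - dilution_penalty, 0.5)   # crude dilution loss
    return rng.random() < eff_sens

patients = np.zeros(170, dtype=bool)      # 170-bed facility
patients[rng.integers(170)] = True        # one silent infection
for day in range(1, 22):                  # 3 weeks of daily pooled testing
    pool = rng.choice(170, size=10, replace=False)
    if pooled_test(patients[pool]):
        print(f"outbreak flagged on day {day}, using 1 test/day")
        break
else:
    print("not detected within 3 weeks")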
Rapid antigen testing as a reactive response to surges in nosocomial SARS-CoV-2 outbreak risk
Healthcare facilities are vulnerable to SARS-CoV-2 introductions and subsequent nosocomial outbreaks. Antigen rapid diagnostic testing (Ag-RDT) is widely used for population screening, but its health and economic benefits as a reactive response to local surges in outbreak risk are unclear. We simulate SARS-CoV-2 transmission in a long-term care hospital with varying COVID-19 containment measures in place (social distancing, face masks, vaccination). Across scenarios, nosocomial incidence is reduced by up to 40-47% (range of means) with routine symptomatic RT-PCR testing, 59-63% with the addition of a timely round of Ag-RDT screening, and 69-75% with well-timed two-round screening. For the latter, a delay of 4-5 days between the two screening rounds is optimal for transmission prevention. Screening efficacy varies depending on test sensitivity, test type, subpopulations targeted, and community incidence. Efficiency, however, varies primarily depending on underlying outbreak risk, with health-economic benefits scaling by orders of magnitude depending on the COVID-19 containment measures in place.
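A toy illustration of why spacing the two Ag-RDT rounds matters (the 3-day pre-detectable window and all other numbers here are assumptions, not the paper's estimates): a delayed second round catches infections that were still undetectable at the first:

# Toy two-round screening: infections become antigen-detectable only after
# an assumed 3-day lag, so a later second round picks up more of them.
import numpy as np

rng = np.random.default_rng(3)
LATENT_DAYS = 3          # days before antigen becomes detectable (assumed)

def detected(infection_days, screen_days):
    """An infection is caught if some screening day falls at/after the day
    it becomes detectable."""
    return np.array([any(s >= d + LATENT_DAYS for s in screen_days)
                     for d in infection_days])

infections = rng.uniform(0, 10, 1000)    # staggered infection times over 10 days
for delay in (1, 3, 5):
    hits = detected(infections, screen_days=[7, 7 + delay])
    print(f"rounds on days 7 and {7 + delay}: {hits.mean():.0%} detected")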