14 results for "Butcher, Brad"
Key mechanisms by which post-ICU activities can improve in-ICU care: results of the international THRIVE collaboratives
Objective: To identify the key mechanisms that clinicians perceive improve care in the intensive care unit (ICU) as a result of their involvement in post-ICU programs.
Methods: Qualitative inquiry via focus groups and interviews with members of the Society of Critical Care Medicine’s THRIVE collaborative sites (follow-up clinics and peer support). Framework analysis was used to synthesize and interpret the data.
Results: Five key mechanisms were identified as drivers of improvement back into the ICU: (1) identifying otherwise unseen targets for ICU quality improvement or education programs—new ideas for quality improvement were generated and greater attention paid to detail in clinical care; (2) creating a new role for survivors in the ICU—former patients and family members adopted an advocacy or peer volunteer role; (3) inviting critical care providers to the post-ICU program to educate, sensitize, and motivate them—clinician peers and trainees were invited to attend as a helpful learning strategy to gain insights into post-ICU care requirements; (4) changing clinicians’ own understanding of the patient experience—there appeared to be a direct individual benefit from working in post-ICU programs; and (5) improving the morale and meaningfulness of ICU work—this was achieved by closing the feedback loop to ICU clinicians regarding patient and family outcomes.
Conclusions: The follow-up of patients and families in post-ICU care settings is perceived to improve care within the ICU via five key mechanisms. Further research is required in this novel area.
Examining the needs of survivors of critical illness through the lens of palliative care: A qualitative study of survivor experiences
To examine the needs of adult survivors of critical illness through a lens of palliative care. A qualitative study of adult survivors of critical illness using semi-structured interviews and framework analysis. Participants were recruited from the post-intensive care unit clinic of a mid-Atlantic academic medical center in the United States. Seventeen survivors of critical illness aged 34–80 (median, 66) participated in the study. The majority of patients were female (64.7%, n = 11) with a median length of index ICU stay of 12 days (interquartile range [IQR] 8–19). Interviews were conducted February to March 2021 and occurred a median of 20 months following the index intensive care stay (range, 13–33 months). We identified six key themes which align with palliative care principles: 1) persistent symptom burden; 2) critical illness as a life-altering experience; 3) spiritual changes and significance; 4) interpreting/managing the survivor experience; 5) feelings of loss and burden; and 6) social support needs. Our findings suggest that palliative care components such as symptom management, goals of care discussions, care coordination, and spiritual and social support may assist in the assessment and treatment of survivors of critical illness.
Modification of social determinants of health by critical illness and consequences of that modification for recovery: an international qualitative study
Objectives: Social determinants of health (SDoH) contribute to health outcomes. We identified SDoH that were modified by critical illness, and the effect of such modifications on recovery from critical illness.
Design: In-depth semistructured interviews following hospital discharge. Interview transcripts were mapped against a pre-existing social policy framework: money and work; skills and education; housing, transport and neighbourhoods; and family, friends and social connections.
Setting: 14 hospital sites in the USA, UK and Australia.
Participants: Patients and caregivers from three continents who had been admitted to critical care.
Results: 86 interviews were analysed (66 patients and 20 caregivers). SDoH, both financial and non-financial in nature, could be negatively influenced by exposure to critical illness, with a direct impact on health-related outcomes at an individual level. Financial modifications included changes to employment status due to critical illness-related disability, alongside changes to income and insurance status. Negative health impacts included the inability to access essential healthcare and an increase in mental health problems.
Conclusions: Critical illness appears to modify SDoH for survivors and their family members, potentially impacting recovery and health. Our findings suggest that increased attention to issues such as one’s social network, economic security and access to healthcare is required following discharge from critical care.
Restrictive versus Liberal Rate of Extracorporeal Volume Removal Evaluation in Acute Kidney Injury (RELIEVE-AKI): a pilot clinical trial protocol
Introduction: Observational studies have linked slower and faster net ultrafiltration (UFNET) rates during kidney replacement therapy (KRT) with mortality in critically ill patients with acute kidney injury (AKI) and fluid overload. To inform the design of a larger randomised trial of patient-centered outcomes, we are conducting a feasibility study to examine restrictive and liberal approaches to UFNET during continuous KRT (CKRT).
Methods and analysis: This study is an investigator-initiated, unblinded, 2-arm, comparative-effectiveness, stepped-wedge, cluster randomised trial among 112 critically ill patients with AKI treated with CKRT in 10 intensive care units (ICUs) across 2 hospital systems. In the first 6 months, all ICUs started with a liberal UFNET rate strategy. Thereafter, one ICU is randomised to the restrictive UFNET rate strategy every 2 months. In the liberal group, the UFNET rate is maintained between 2.0 and 5.0 mL/kg/hour; in the restrictive group, the UFNET rate is maintained between 0.5 and 1.5 mL/kg/hour. The three coprimary feasibility outcomes are (1) between-group separation in mean delivered UFNET rates; (2) protocol adherence; and (3) patient recruitment rate. Secondary outcomes include daily and cumulative fluid balance, KRT and mechanical ventilation duration, organ failure-free days, ICU and hospital length of stay, hospital mortality and KRT dependence at hospital discharge. Safety endpoints include haemodynamics, electrolyte imbalance, CKRT circuit issues, organ dysfunction related to fluid overload, secondary infections and thrombotic and haematological complications.
Ethics and dissemination: The University of Pittsburgh Human Research Protection Office approved the study, and an independent Data and Safety Monitoring Board monitors the study. A grant from the United States National Institute of Diabetes and Digestive and Kidney Diseases sponsors the study. The trial results will be submitted for publication in peer-reviewed journals and presented at scientific conferences.
Trial registration number: This trial has been prospectively registered with clinicaltrials.gov (NCT05306964). Protocol version identifier and date: 1.5; 13 June 2023.
Case 39-2009
A 28-year-old woman was admitted to this hospital 2 days post partum because of cardiac failure. Four weeks earlier, at approximately 29 weeks' gestation, retrosternal chest discomfort developed, followed by shortness of breath, myalgias, and fatigue. Echocardiography revealed a pericardial effusion, with tamponade and biventricular hypokinesis. Despite pericardiocentesis and cesarean section at 32 weeks, her condition worsened. Radiographs obtained at this hospital showed a widened mediastinum; echocardiography revealed thickening of the pericardium and abnormal material surrounding the great vessels. A diagnostic procedure was performed.
Presentation of Case. Dr. Brad W. Butcher (Medicine): A 28-year-old woman was admitted to this hospital 2 days post partum because of cardiac failure. The patient had been well, except for gestational diabetes mellitus, until 4 weeks earlier (approximately 29 weeks' gestation), when retrosternal chest discomfort developed, followed by shortness of breath, myalgias, and fatigue. One week later, she went to the emergency department of another hospital. Laboratory-test results are shown in Table 1. Cephalexin and a proton-pump inhibitor were prescribed. Two days later, she returned to the hospital because of persistent symptoms. She had no fevers, night sweats, upper respiratory . . .
Immune competence traits assessed during the stress of weaning are heritable and favorably genetically correlated with temperament traits in Angus cattle
Selection for production traits with little or no emphasis on health-related traits has the potential to increase susceptibility to disease in food-producing animals. A possible genetic strategy to mitigate such effects is to include both production and health traits in the breeding objective when selecting animals. For this to occur, reliable methodologies are required to assess beneficial health traits, such as the immune capacity of animals. We describe here a methodology to assess the immune competence of beef cattle which is both practical to apply on farm and does not restrict the future sale of tested animals. The methodology also accommodates variation in prior vaccination history of cohorts of animals being tested. In the present study, the immune competence phenotype of 1,100 Angus calves was assessed during yard weaning. Genetic parameters associated with immune competence traits were estimated and associations between immune competence, temperament, and stress-coping ability traits were investigated. Results suggested that immune competence traits, related to an animal’s ability to mount both antibody and cell-mediated immune responses, are moderately heritable (h2 = 0.32 ± 0.09 and 0.27 ± 0.08, respectively) and favorably genetically correlated with the temperament trait, flight time (r = 0.63 ± 0.31 and 0.60 ± 0.29 with antibody and cell-mediated immune responses, respectively). Development of methodologies to assess the immune competence phenotype of beef cattle is a critical first step in the establishment of genetic selection strategies aimed at improving the general disease resistance of beef herds. Strategies aimed at reducing the incidence of disease in beef cattle are expected to significantly improve animal health and welfare, reduce reliance on the use of antibiotics to treat disease, and reduce disease-associated costs incurred by producers.
Associations between immune competence phenotype and feedlot health and productivity in Angus cattle
Genetic strategies aimed at improving general immune competence (IC) have the potential to reduce the incidence and severity of disease in beef production systems, with resulting benefits of improved animal health and welfare and reduced reliance on antibiotics to prevent and treat disease. Implementation of such strategies first requires that methodologies be developed to phenotype animals for IC and demonstration that these phenotypes are associated with health outcomes. We have developed a methodology to identify IC phenotypes in beef steers during the yard weaning period, which is both practical to apply on-farm and does not restrict the future sale of tested animals. In the current study, a total of 838 Angus steers, previously IC phenotyped at weaning, were categorized as low (n = 98), average (n = 653), or high (n = 88) for the IC phenotype. Detailed health and productivity data were collected on all steers during feedlot finishing, and associations between IC phenotype, health outcomes, and productivity were investigated. A favorable association between IC phenotype and number of mortalities during feedlot finishing was observed with higher mortalities recorded in low IC steers (6.1%) as compared with average (1.2%, P < 0.001) or high (0%, P = 0.018) IC steers. Disease incidence was numerically highest in low IC steers (15.3 cases/100 animals) and similar in average IC steers (10.1 cases/100 animals) and high IC steers (10.2 cases/100 animals); however, differences between groups were not significant. No significant influence of IC phenotype on average daily gain was observed, suggesting that selection for improved IC is unlikely to incur a significant penalty to production. The potential economic benefits of selecting for IC in the feedlot production environment were calculated. Health-associated costs were calculated as the sum of lost production costs, lost capital investment costs, and disease treatment costs.
Based on these calculations, health-associated costs were estimated at AUS$103/head in low IC steers, AUS$25/head in average IC steers, and AUS$4/head in high IC steers, respectively. These findings suggest that selection for IC has the potential to reduce mortalities during feedlot finishing and, as a consequence, improve the health and welfare of cattle in the feedlot production environment and reduce health-associated costs incurred by feedlot operators.
Assessment of Discrepancies Between Follow-up Infarct Volume and 90-Day Outcomes Among Patients With Ischemic Stroke Who Received Endovascular Therapy
Some patients have poor outcomes despite small infarcts after endovascular therapy (EVT), while others with large infarcts do well. Understanding why these discrepancies occur may help to optimize EVT outcomes. To validate exploratory findings from the Endovascular Treatment for Small Core and Anterior Circulation Proximal Occlusion with Emphasis on Minimizing CT to Recanalization Times (ESCAPE) trial regarding pretreatment, treatment-related, and posttreatment factors associated with discrepancies between follow-up infarct volume (FIV) and 90-day functional outcome. This cohort study is a post hoc analysis of the Safety and Efficacy of Nerinetide in Subjects Undergoing Endovascular Thrombectomy for Stroke (ESCAPE-NA1) trial, a double-blind, randomized, placebo-controlled, international, multicenter trial conducted from March 2017 to August 2019. Patients who participated in ESCAPE-NA1 and had available 90-day modified Rankin Scale (mRS) scores and 24-hour to 48-hour posttreatment follow-up parenchymal imaging were included. Small FIV (volume ≤25th percentile) and large FIV (volume ≥75th percentile) on 24-hour computed tomography/magnetic resonance imaging. Baseline factors, outcomes, treatments, and poststroke serious adverse events (SAEs) were compared between discrepant cases (ie, patients with 90-day mRS score ≥3 despite small FIV or those with mRS scores ≤2 despite large FIV) and nondiscrepant cases. Area under the curve (AUC) and goodness of fit of prespecified logistic models, including pretreatment (eg, age, cancer, vascular risk factors) and treatment-related and posttreatment (eg, SAEs) factors, were compared with stepwise regression-derived models for ability to identify small FIV with higher mRS score and large FIV with lower mRS score. 
Among 1091 patients (median [IQR] age, 70.8 [60.8-79.8] years; 549 [49.7%] women; median [IQR] FIV, 24.9 mL [6.6-92.2 mL]), 42 of 287 patients (14.6%) with FIV of 7 mL or less (ie, ≤25th percentile) had an mRS score of at least 3; 65 of 275 patients (23.6%) with FIV of 92 mL or greater (ie, ≥75th percentile) had an mRS score of 2 or less. Prespecified models of pretreatment factors (ie, age, cancer, vascular risk factors) associated with low FIV and higher mRS score performed similarly to models selected by stepwise regression (AUC, 0.92 [95% CI, 0.89-0.95] vs 0.93 [95% CI, 0.90-0.95]; P = .42). SAEs, specifically infarct in new territory, recurrent stroke, pneumonia, and congestive heart failure, were associated with low FIV and higher mRS scores; stepwise models also identified 24-hour hemoglobin as a treatment-related/posttreatment factor (AUC, 0.92 [95% CI, 0.90-0.95] vs 0.94 [95% CI, 0.91-0.96]; P = .14). Younger age was associated with high FIV and lower mRS score; stepwise models identified absence of diabetes and higher baseline hemoglobin as additional pretreatment factors (AUC, 0.76 [95% CI, 0.70-0.82] vs 0.77 [95% CI, 0.71-0.83]; P = .82). Absence of SAEs, especially stroke progression, symptomatic intracerebral hemorrhage, and pneumonia, was associated with high FIV and lower mRS score; stepwise models also identified 24-hour hemoglobin level, glucose, and diastolic blood pressure as posttreatment factors associated with discrepant cases (AUC, 0.80 [95% CI, 0.74-0.87] vs 0.79 [95% CI, 0.72-0.86]; P = .92). In this study, discrepancies between functional outcome and post-EVT infarct volume were associated with differences in pretreatment factors, such as age and comorbidities, and posttreatment complications related to index stroke evolution, secondary prevention, and quality of stroke unit care.
Besides preventing such complications, optimization of blood pressure, glucose levels, and hemoglobin levels are potentially modifiable factors meriting further study.