55 result(s) for "Catheterization, Central Venous - utilization"
De-implementation strategy to Reduce the Inappropriate use of urinary and intravenous CATheters: study protocol for the RICAT-study
Background: Urinary and (peripheral and central) intravenous catheters are widely used in hospitalized patients. However, up to 56% of these catheters do not have an appropriate indication, and serious complications can occur with their use. The main objective of our quality improvement project is to reduce the use of catheters without an appropriate indication by 25–50%, and to evaluate the factors affecting our de-implementation strategy. Methods: In a multicenter, prospective interrupted time series analysis, several interventions to avoid inappropriate use of catheters will be conducted in seven hospitals in the Netherlands. First, we will define a list of appropriate indications for urinary and (peripheral and central) intravenous catheters, which will restrict the use of catheters and prompt catheter removal when the indication is no longer appropriate. Second, after the baseline measurements, the intervention will take place, consisting of a kick-off meeting (including a competitive feedback report of the baseline measurements) and education of healthcare workers and patients. Additional strategies based on the baseline data and local conditions are optional. The primary endpoint is the percentage of catheters with an inappropriate indication on the day of data collection, before and after the de-implementation strategy. Secondary endpoints are catheter-related infections or other complications, catheter re-insertion rate, length of hospital (and ICU) stay, and mortality. In addition, the cost-effectiveness of the de-implementation strategy will be calculated. Discussion: This study aims to reduce the use of urinary and intravenous catheters with an inappropriate indication and, as a result, to reduce catheter-related complications. If (cost-)effective, it provides a tool for a nationwide approach to reducing catheter-related infections and other complications. Trial registration: Dutch trial registry, NTR6015. Registered 9 August 2016.
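The primary endpoint lends itself to a segmented regression over the measurement rounds. Below is a minimal, hypothetical Python sketch of such an interrupted time series model; the column names and values are illustrative assumptions, not RICAT data.

```python
# Minimal sketch of a segmented (interrupted time series) regression;
# all numbers and column names are illustrative, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "month": np.arange(1, 15),                          # consecutive measurement rounds
    "post": [0] * 7 + [1] * 7,                          # 0 = baseline, 1 = post-intervention
    "inappropriate_pct": [42, 40, 41, 39, 43, 38, 40,   # baseline measurements
                          33, 30, 29, 27, 26, 25, 24],  # post-intervention measurements
})
# Time elapsed since the intervention (0 during baseline).
df["months_since_intervention"] = np.where(df["post"] == 1, df["month"] - 7, 0)

# 'post' captures the immediate level change; 'months_since_intervention' the slope change.
model = smf.ols("inappropriate_pct ~ month + post + months_since_intervention",
                data=df).fit()
print(model.summary())
```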
Long-term outcomes of totally implantable venous access devices
Purpose: Identifying risk factors for premature totally implantable venous access device (TIVAD) catheter removal is crucial; however, because of the diversity of study methodologies, there is no consensus on such factors. The objective of the present study was to identify such risk factors by applying a cohort study design with a long-term follow-up period. Methods: For this cohort study, we selected cancer patients who had TIVADs newly implanted between July 2008 and December 2008. The follow-up period lasted until September 2012. Univariate analysis was performed for age, gender, cancer type, TIVAD brand, puncture site, sidedness of puncture, and catheter tip position. The hazard ratio (HR) of potential risk factors was calculated using the Cox proportional hazards regression model, and Kaplan–Meier curves were applied for catheter survival analysis. Results: Our study consisted of 240 patients, with 5 lost to follow-up. The cumulative premature catheter removal rate of all TIVADs was 9.8%, with the most common reason for premature removal being port-associated bloodstream infection (PABSI), which was highest in patients with hematologic cancer (27.8%) and upper gastrointestinal cancer (19.4%). Suboptimal tip position (HR 5.13, 95% confidence interval 1.73–15.21) was also a risk factor for premature removal, and it was correlated with symptomatic TIVAD occlusion (p = 0.0004). Conclusions: PABSI was the most common reason for premature catheter removal, with a varied incidence rate between different cancer types. Suboptimal tip position was also a risk factor. Confirming the final tip position after implantation is crucial. Infection control is important for TIVAD care, especially in high-risk cancer patients.
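The analysis described (Kaplan–Meier catheter survival plus a Cox model for risk factors such as tip position) can be sketched with the lifelines package; the toy data and column names below are assumptions for illustration, not the study's data.

```python
# Illustrative catheter "survival" analysis (time to premature removal).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "days_in_place":     [120, 400, 35, 800, 260, 75, 510, 190],  # hypothetical
    "premature_removal": [1, 0, 1, 0, 0, 1, 0, 1],                # 1 = removed prematurely
    "suboptimal_tip":    [1, 0, 1, 0, 1, 1, 0, 0],                # candidate risk factor
})

# Kaplan-Meier curve for catheter survival.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["days_in_place"], event_observed=df["premature_removal"])
print(kmf.survival_function_)

# Cox proportional hazards model: hazard ratio for suboptimal tip position.
cph = CoxPHFitter()
cph.fit(df, duration_col="days_in_place", event_col="premature_removal")
cph.print_summary()  # the exp(coef) column is the hazard ratio
```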
The impact of central line insertion bundle on central line-associated bloodstream infection
Background: Knowledge about the impact of each component of a central line insertion bundle on central line-associated bloodstream infection (CLABSI) is limited. Methods: A quality-improvement intervention, including education, a central venous catheter (CVC) insertion bundle, and process and outcome surveillance, has been implemented since March 2013. Outcome surveillance measures, including CLABSIs per 1,000 catheter-days, CLABSIs per 1,000 inpatient-days, and the catheter utilization rate (days of catheter use divided by total inpatient-days), were collected. As a baseline for comparison, we retrospectively collected data from March 1, 2012 to December 31, 2012. Results: During this 10-month period, there were a total of 687 CVC insertions, and 627 (91.2%) were performed by intensivists. The rate of CLABSI declined significantly from 1.65 per 1,000 catheter-days during the pre-intervention period to 0.65 per 1,000 catheter-days during the post-intervention period (P = 0.039). CLABSI was more likely to develop when a maximal sterile barrier was not used than when it was (P = 0.03). Moreover, CVCs inserted by non-intensivists were more likely to become infected than CVCs inserted by intensivists (P = 0.010). Conclusions: This multidisciplinary infection control intervention, including a central line insertion care bundle, can effectively reduce the rate of CLABSI. The impact of individual bundle components varies, and maximal sterile barrier precautions during catheter insertion are an essential component of the central line insertion bundle.
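The surveillance rates named in the Methods follow directly from their definitions; here is a small Python sketch with placeholder counts, not figures from the study.

```python
# Surveillance rate definitions; counts are made-up placeholders.
clabsi_cases   = 12
catheter_days  = 7300     # total central-line days in the period
inpatient_days = 36500    # total inpatient-days in the period

clabsi_per_1000_catheter_days  = 1000 * clabsi_cases / catheter_days
clabsi_per_1000_inpatient_days = 1000 * clabsi_cases / inpatient_days
catheter_utilization_rate      = catheter_days / inpatient_days

print(f"CLABSI per 1,000 catheter-days:  {clabsi_per_1000_catheter_days:.2f}")
print(f"CLABSI per 1,000 inpatient-days: {clabsi_per_1000_inpatient_days:.2f}")
print(f"Catheter utilization rate:       {catheter_utilization_rate:.2f}")
```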
Vital Signs: Central Line–Associated Blood Stream Infections — United States, 2001, 2008, and 2009
Background: Health-care–associated infections (HAIs) affect 5% of patients hospitalized in the United States each year. Central line–associated blood stream infections (CLABSIs) are important and deadly HAIs, with reported mortality of 12%–25%. This report provides national estimates of the number of CLABSIs among patients in intensive-care units (ICUs), inpatient wards, and outpatient hemodialysis facilities in 2008 and 2009 and compares ICU estimates with 2001 data. Methods: To estimate the total number of CLABSIs among patients aged ≥1 year in the United States, CDC multiplied central-line utilization and CLABSI rates by estimates of the total number of patient-days in each of three settings: ICUs, inpatient wards, and outpatient hemodialysis facilities. CDC identified total inpatient-days from the Healthcare Cost and Utilization Project's National Inpatient Sample and from the Hospital Cost Report Information System. Central-line utilization and CLABSI rates were obtained from the National Nosocomial Infections Surveillance System for 2001 estimates (ICUs only) and from the National Healthcare Safety Network (NHSN) for 2009 estimates (ICUs and inpatient wards). CDC estimated the total number of outpatient hemodialysis patient-days in 2008 using the single-day number of maintenance hemodialysis patients from the U.S. Renal Data System. Outpatient hemodialysis central-line utilization was obtained from the Fistula First Breakthrough Initiative, and hemodialysis CLABSI rates were estimated from NHSN. Annual pathogen-specific CLABSI rates were calculated for 2001–2009. Results: In 2001, an estimated 43,000 CLABSIs occurred among patients hospitalized in ICUs in the United States. In 2009, the estimated number of ICU CLABSIs had decreased to 18,000. Reductions in CLABSIs caused by Staphylococcus aureus were more marked than reductions in infections caused by gram-negative rods, Candida spp., and Enterococcus spp. In 2009, an estimated 23,000 CLABSIs occurred among patients in inpatient wards and, in 2008, an estimated 37,000 CLABSIs occurred among patients receiving outpatient hemodialysis. Conclusions: In 2009 alone, an estimated 25,000 fewer CLABSIs occurred in U.S. ICUs than in 2001, a 58% reduction. This represents up to 6,000 lives saved and $414 million in potential excess health-care costs in 2009 and approximately $1.8 billion in cumulative excess health-care costs since 2001. A substantial number of CLABSIs continue to occur, especially in outpatient hemodialysis centers and inpatient wards. Implications for Public Health Practice: Major reductions have occurred in the burden of CLABSIs in ICUs. State and federal efforts coordinated and supported by CDC, the Agency for Healthcare Research and Quality, and the Centers for Medicare & Medicaid Services and implemented by numerous health-care providers likely have helped drive these reductions. The substantial number of infections occurring in non-ICU settings, especially in outpatient hemodialysis centers, and the smaller decreases in non–S. aureus CLABSIs reveal important areas for expanded prevention efforts. Continued success in CLABSI prevention will require increased adherence to current CLABSI prevention recommendations, development and implementation of additional prevention strategies, and the ongoing collection and analysis of data, including specific microbiologic information.
To prevent CLABSIs in hemodialysis patients, efforts to reduce central line use for hemodialysis and improve the maintenance of central lines should be expanded. The model of federal, state, facility, and health-care provider collaboration that has proven so successful in CLABSI prevention should be applied to other HAIs and other health-care–associated conditions.
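The estimation arithmetic described in the Methods (multiplying central-line utilization and CLABSI rates by patient-days in each setting) can be sketched as follows; the inputs are placeholders, not the CDC's actual values.

```python
# Setting-level CLABSI burden estimate: patient-days x utilization x rate/1000.
# All numbers below are placeholders for illustration only.
settings = {
    # setting: (patient_days, central_line_utilization, clabsi_per_1000_line_days)
    "ICU":                     (20_000_000, 0.45, 1.8),
    "inpatient ward":          (120_000_000, 0.10, 1.1),
    "outpatient hemodialysis": (30_000_000, 0.25, 2.5),
}

for setting, (patient_days, utilization, rate) in settings.items():
    central_line_days = patient_days * utilization
    estimated_clabsis = central_line_days * rate / 1000
    print(f"{setting}: ~{estimated_clabsis:,.0f} estimated CLABSIs")
```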
Impact of early do-not-attempt-resuscitation orders on procedures and outcomes of severe sepsis
Do-not-attempt-resuscitation (DNAR) orders are common in severe sepsis, but the impact on clinical care is not known. Our primary objective was to determine the impact of early DNAR orders on in-hospital mortality and performance of key interventional procedures among severe sepsis hospitalizations. Our secondary objective was to further investigate what patient characteristics within the sepsis DNAR population affected outcomes. Using the 2010-2011 California State Inpatient Dataset, we analyzed hospitalizations for adults admitted through the emergency department with severe sepsis. Our primary predictor was a DNAR order, and our outcomes were in-hospital mortality and performance of interventional procedures. Visits with early DNAR orders accounted for 20.3% of severe sepsis hospitalizations. An early DNAR order was a strong, independent predictor of higher in-hospital mortality (odds ratio [OR], 4.03; 95% confidence interval, 3.88-4.19) and lower performance of critical procedures: central venous line (OR, 0.70), mechanical ventilation (OR, 0.80), hemodialysis (OR, 0.61), and major operative procedure (OR, 0.46). Among those with early DNAR orders, older age and rural location were the strongest predictors for a lack of interventional procedures. Although DNAR orders are not synonymous with “do not treat,” they may unintentionally limit aggressive treatment for severe sepsis patients, especially in older adults.
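Adjusted odds ratios like those reported above come from exponentiating logistic regression coefficients; the sketch below uses a synthetic data frame and hypothetical column names, not the California State Inpatient Dataset.

```python
# Deriving odds ratios and 95% CIs from a logistic model (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "died":       rng.integers(0, 2, n),   # in-hospital mortality (0/1)
    "early_dnar": rng.integers(0, 2, n),   # early DNAR order (0/1)
    "age":        rng.integers(40, 95, n),
})

fit = smf.logit("died ~ early_dnar + age", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)       # e.g. the OR for early_dnar
ci = np.exp(fit.conf_int())            # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```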
Use of port-a-cath in cancer patients: a single-center experience
Introduction: Central venous catheters play an important role in the management of cancer patients. Different types of devices are associated with different patterns of complications. We report on the pattern of use and rate of complications of port-a-caths in patients diagnosed with malignant cancer at a single institution. Methodology: The data were collected retrospectively from patients who received treatment for solid tumors or lymphoma through a port-a-cath at the Sultan Qaboos University Hospital (SQUH) between January 2007 and February 2013. Results: A total of 117 port-a-caths were inserted in 106 patients. The majority (86; 73.5%) were implanted by an interventional radiologist, and the right internal jugular vein was accessed in 79 (67.5%) patients. Mean catheter indwelling time was 354 (range 3–1,876) days for all patients, and 252 (3–1,876) and 389 (13–1,139) days for patients with and without complications, respectively. Thirty (25.6%) port-a-caths were removed prematurely, mainly due to infectious complications, while 17 (14.5%) were removed after completion of treatment. Staphylococcus aureus was the most frequently isolated organism, found in 8 (6.8%) patients. Underlying diagnosis (p < 0.001), chemotherapy regimen (p < 0.001), sensitivity to antibiotics (p = 0.01), and any complication (p < 0.001) were significant factors affecting the duration of port-a-cath use. None of these factors were significant on multivariate Cox regression analysis. Conclusions: The mean duration of port-a-cath use was almost one year. Infection was the most common complication leading to premature removal, followed by port thrombosis.
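A multivariable Cox model of catheter indwelling time, as referenced in the Results, might be set up as below; the covariates and data are hypothetical, and the small penalizer is only there to stabilise the tiny toy example. In such a joint model, factors that look significant on univariate testing can lose significance once fitted together.

```python
# Multivariable Cox regression on catheter indwelling time (hypothetical data).
import pandas as pd
from lifelines import CoxPHFitter

raw = pd.DataFrame({
    "indwelling_days":   [354, 40, 900, 120, 600, 15, 760, 300, 480, 60],
    "premature_removal": [0, 1, 0, 1, 0, 1, 0, 0, 0, 1],
    "diagnosis": ["breast", "lymphoma", "breast", "gi", "gi",
                  "lymphoma", "breast", "gi", "breast", "lymphoma"],
    "any_complication": [0, 1, 0, 1, 0, 1, 0, 0, 1, 1],
})

# One-hot encode the categorical covariate before fitting.
df = pd.get_dummies(raw, columns=["diagnosis"], drop_first=True, dtype=float)

cph = CoxPHFitter(penalizer=0.1)  # small penalty keeps the toy fit stable
cph.fit(df, duration_col="indwelling_days", event_col="premature_removal")
cph.print_summary()
```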
Recipes for checklists and bundles: one part active ingredient, two parts measurement
Furthermore, CVC-BSIs occurring within 48 h of admission showed comparable declines. When hospitalised patients develop unexplained shortness of breath, clinicians routinely consider the possibility of pulmonary embolism; but they will only test for this diagnosis in the most obvious cases if patients are consistently receiving pharmacologic thromboembolism prophylaxis. [...] evaluation of an intervention to improve venous thromboembolism prophylaxis that relied on the decisions of clinicians to pursue the diagnosis would show improvements in the outcome simply on the basis of decreased testing. [...] the sterility of a randomised controlled trial is a luxury most real-world improvement interventions cannot provide.
Trans-anastomotic tubes reduce the need for central venous access and parenteral nutrition in infants with congenital duodenal obstruction
Purpose: To determine the effect of trans-anastomotic tube (TAT) feeding on outcome following repair of congenital duodenal obstruction (CDO). Methods: Retrospective comparative study of all infants with CDO over 10 years. Data are median (range). The Mann–Whitney U test and Fisher's exact test were used. Results: Of 55 infants with CDO (48 atresia, 7 stenosis), 17 were managed with a TAT and 38 without. Enteral feeds were commenced earlier in infants with a TAT than in those without (TAT 2 days post-repair [1–4] vs. no-TAT 3 days post-repair [1–7]; p = 0.006). Infants with a TAT achieved full enteral feeds significantly sooner than those without (TAT 6 days post-repair [2–12] vs. no-TAT 9 days post-repair [3–36]; p = 0.005). Significantly fewer infants in the TAT group required central venous catheter (CVC) placement and parenteral nutrition (PN) than in the no-TAT group (TAT 2/17 vs. no-TAT 28/38, p < 0.0001). There were six CVC-related complications (5 infections, 1 PN extravasation), and four TATs became displaced and were removed before full enteral feeds were achieved. One infant with a TAT, trisomy 21, and undiagnosed Hirschsprung disease developed an anastomotic leak and jejunal perforation requiring re-operation. Conclusions: A TAT significantly shortens the time to full enteral feeds in infants with CDO, significantly reducing the need for central venous access and PN.
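Both tests named in the Methods are available in SciPy; this short sketch uses made-up feeding times and the 2/17 vs. 28/38 CVC counts quoted above.

```python
# Mann-Whitney U for days to full enteral feeds, Fisher's exact for CVC/PN need.
from scipy.stats import mannwhitneyu, fisher_exact

# Days to full enteral feeds after repair (hypothetical values).
tat_group    = [6, 5, 7, 4, 8, 6, 9, 5]
no_tat_group = [9, 12, 8, 14, 10, 9, 11, 13]

u_stat, p_mwu = mannwhitneyu(tat_group, no_tat_group, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mwu:.4f}")

# 2x2 table: rows = TAT / no-TAT, columns = needed CVC+PN / did not (from the abstract).
table = [[2, 15],    # TAT group: 2 of 17 needed a CVC
         [28, 10]]   # no-TAT group: 28 of 38 needed a CVC
odds_ratio, p_fisher = fisher_exact(table)
print(f"Fisher's exact p = {p_fisher:.6f}")
```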
Adult intraosseous use in academic EDs and simulated comparison of emergent vascular access techniques
Time to delivery of medications, fluids, and blood products can be vital for survival. Because of its ease and speed, pediatric advanced life support (PALS) training advises intraosseous (IO) access if intravenous (IV) access is unsuccessful after 2 attempts [2]. To compare emergent vascular access techniques, a convenience sample of 41 EM residents, faculty, and physician assistants from our academic institution volunteered to place a simulated femoral central line, an ultrasound-guided IV, and a proximal tibia IO line.
Agreement of reported vascular access on the medical evidence report and on medicare claims at hemodialysis initiation
Background: The choice of vascular access type is an important aspect of care for incident hemodialysis patients. However, data from the Centers for Medicare & Medicaid Services (CMS) Medical Evidence Report (form CMS-2728) identifying the first access for incident patients have not previously been validated. Medicare began requiring that vascular access type be reported on claims in July 2010. We aimed to determine the agreement between the vascular access at initiation reported on form CMS-2728 and on Medicare claims. Methods: This retrospective study used a cohort of 9777 patients who initiated dialysis in the latter half of 2010 and were eligible for Medicare at the start of renal replacement therapy to compare the vascular access type reported on form CMS-2728 with the type reported on Medicare outpatient dialysis claims for the same patients. For each patient, the reported access from each data source was compiled; the percent agreement represented the percentage of patients for whom the access was the same. Multivariate logistic analysis was performed to identify characteristics associated with agreement of the reported access. Results: The two data sources agreed for 94% of patients, with a Kappa statistic of 0.83, indicating an excellent level of agreement. Further, we found no evidence to suggest that agreement was associated with the patient characteristics of age, sex, race, or primary cause of renal failure. Conclusion: These results suggest that vascular access data as reported on form CMS-2728 are valid and reliable for use in research studies.
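Percent agreement and Cohen's kappa, as reported in the Results, can be computed as follows; the access labels below are illustrative, not CMS data.

```python
# Percent agreement and Cohen's kappa between two sources of access type.
from sklearn.metrics import cohen_kappa_score

# Vascular access type per patient from the two sources (hypothetical labels).
form_2728 = ["catheter", "fistula", "catheter", "graft", "fistula", "catheter"]
claims    = ["catheter", "fistula", "catheter", "graft", "catheter", "catheter"]

agreement = sum(a == b for a, b in zip(form_2728, claims)) / len(form_2728)
kappa = cohen_kappa_score(form_2728, claims)

print(f"Percent agreement: {agreement:.1%}")
print(f"Cohen's kappa:     {kappa:.2f}")
```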