59 result(s) for "Taaffe, Kevin"
Work systems analysis of sterile processing: decontamination
Background: Few studies have explored the work of sterile processing departments (SPDs) from a systems perspective. Effective decontamination is critical for removing organic matter from and reducing microbial levels on used surgical instruments prior to disinfection or sterilisation, and is delivered through a combination of human work and supporting technologies and processes. Objective: In this paper we report the results of a work systems analysis that sought to identify the complex multilevel interdependencies that create performance variation in decontamination and to identify potential improvement interventions. Methods: The research was conducted at a 700-bed academic hospital with two reprocessing facilities decontaminating approximately 23 000 units each month. Mixed methods, including 56 hours of observation of work as done, formal and informal interviews with relevant stakeholders, and analysis of data collected about the system, were used to iteratively develop a process map, task analysis, abstraction hierarchy and variance matrix. Results: We identified 21 different performance shaping factors, 30 potential failures, 16 types of process variance and 10 outcome variances in decontamination. Approximately 2% of trays were returned to decontamination from assembly, while decontamination problems were found in about 1% of surgical cases. Staff knowledge, production pressures, instrument design, tray composition and workstation design contributed to outcomes such as reduced throughput, tray defects, staff injuries, increased inventory and equipment costs, and patient injuries. Conclusions: Ensuring patient and technician safety and efficient SPD operation requires improved design of instruments and the decontamination area, skilled staff, proper equipment maintenance and effective coordination of reprocessing tasks.
Work systems analysis of sterile processing: assembly
Background: Sterile processing departments (SPDs) play a crucial role in surgical safety and efficiency. SPDs clean instruments to remove contaminants (decontamination), inspect and reorganise instruments into their correct trays (assembly), then sterilise and store instruments for future use (sterilisation and storage). However, broken, missing or inappropriately cleaned instruments are a frequent problem for surgical teams. These issues should be identified and corrected during the assembly phase. Objective: A work systems analysis, framed within the Systems Engineering Initiative for Patient Safety (SEIPS) model, was used to develop a comprehensive understanding of the assembly stage of reprocessing, identify the range of work challenges and uncover the inter-relationships among system components influencing reliable instrument reprocessing. Methods: The study was conducted at a 700-bed academic hospital in the Southeastern United States with two reprocessing facilities from October 2017 to October 2018. Fifty-six hours of direct observations and 36 interviews were used to iteratively develop the work systems analysis. This included a process map and task analysis developed to describe the assembly system, an abstraction hierarchy developed to identify possible performance shaping factors (based on SEIPS) and a variance matrix developed to illustrate the relationships among tasks, performance shaping factors, failures and outcomes. Operating room (OR) reported tray defect data from July 2016 to December 2017 were analysed to identify the percentage and types of defects across reprocessing phases and the most common assembly defects. Results: The majority of the 3900 tray defects occurred during the assembly phase, impacting 5% of surgical cases (n=41 799). Missing instruments, which could result in OR delays and increased surgical duration, were the most commonly reported assembly defect (17.6%, n=700). High variability was observed in the reassembling of trays, with failures including adding incorrect instruments, omitting instruments and failing to remove damaged instruments. These failures were precipitated by technological shortcomings, production pressures, tray composition, unstandardised instrument nomenclature and inadequate SPD staff training. Conclusions: Supporting patient safety, minimising tray defects and OR delays, and improving the overall reliability of instrument reprocessing require a well-designed instrument tracking system, standardised nomenclature, effective coordination of reprocessing tasks between the SPD and the OR, and well-trained sterile processing technicians.
Risk-adjusted policies to minimise perioperative staffing shortages during a pandemic: An agent-based simulation study
Healthcare workers' (HCWs) safety and availability to care for patients are critical during a pandemic such as the one caused by severe acute respiratory syndrome coronavirus 2. Among providers of different specialities, it is especially important to protect those working in hospital settings with a high risk of infection. Using an agent-based simulation model, various staffing policies were developed and simulated for 90 days using data from the largest health systems in South Carolina. The model considers staffing policies that include geographic segregation, interpersonal contact limits, and a combination of factors, including the patient census, transmission rates, vaccination status of providers, hospital capacity, incubation time, quarantine period, and interactions between patients and providers. Comparing the existing practices to various risk-adjusted staffing policies, model predictions show that restricted teaming and rotating schedules significantly (p-value <0.01) reduced weekly HCW unavailability and the number of infected HCWs by 22% and 38%, respectively, when vaccination rates among HCWs were lower (<75%). However, as the vaccination rate increases, the benefits of risk-adjusted policies diminish; when 90% of HCWs were vaccinated, there were no significant (p-value = 0.09) benefits. Although these simulated outcomes are specific to one health system, our findings can be generalised to other health systems with multiple locations.
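The restricted-teaming idea above can be illustrated with a toy agent-based simulation. Everything below (function name, parameter values, contact structure) is a hypothetical sketch for illustration, not the paper's actual model:

```python
import random

def simulate_staffing(days=90, n_hcw=40, teams=1, p_transmit=0.05,
                      contacts_per_day=6, quarantine_days=10,
                      p_vaccinated=0.5, vaccine_protection=0.8, seed=1):
    """Toy model: return total HCWs ever infected after `days`.

    Contacts are restricted to within-team pairs, so teams > 1 mimics a
    restricted-teaming policy; vaccination reduces transmission per contact.
    """
    rng = random.Random(seed)
    team = [i % teams for i in range(n_hcw)]
    vaccinated = [rng.random() < p_vaccinated for _ in range(n_hcw)]
    infected = [False] * n_hcw
    sick_until = [0] * n_hcw          # day the HCW stops being infectious
    infected[0] = True                # one index case seeds the outbreak
    sick_until[0] = quarantine_days
    for day in range(days):
        for i in range(n_hcw):
            if not infected[i] or day >= sick_until[i]:
                continue              # only currently sick HCWs transmit
            for _ in range(contacts_per_day):
                j = rng.randrange(n_hcw)
                if j == i or team[j] != team[i] or infected[j]:
                    continue          # teams never mix in this sketch
                p = p_transmit * ((1 - vaccine_protection) if vaccinated[j] else 1)
                if rng.random() < p:
                    infected[j] = True
                    sick_until[j] = day + quarantine_days
    return sum(infected)
```

With a fixed seed the run is reproducible, so policies (e.g. `teams=1` versus `teams=4`) can be compared on identical random streams, a common variance-reduction trick in simulation comparisons.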
Minor flow disruptions, traffic-related factors and their effect on major flow disruptions in the operating room
Background: Studies in operating rooms (ORs) show that minor disruptions tend to group together to result in serious adverse events such as surgical errors. Understanding the characteristics of the minor flow disruptions (FDs) that impact major events is important in order to proactively design safer systems. Objective: The purpose of this study is to use a systems approach to investigate the aetiology of minor and major FDs in ORs in terms of the people involved, tasks performed and OR traffic, as well as the location of FDs and other environmental characteristics of the OR that may contribute to these disruptions. Methods: Using direct observation and classification of FDs via video recordings of 28 surgical procedures, this study modelled the impact of a range of system factors (location of minor FDs, roles of staff members involved in FDs, type of staff activities, and OR traffic-related factors) on major FDs in the OR. Results: The rate of major FDs increases as the rate of minor FDs increases, especially in the context of equipment-related FDs and specific physical locations in the OR. Circulating-nurse-related minor FDs and minor FDs that took place in transitional zone 2, near the foot of the surgical table, were also related to an increase in the rate of major FDs. This study also found that more major and minor FDs took place in the anaesthesia zone than in all other OR zones. Layout-related disruptions comprised more than half of all observed FDs. Conclusion: Room design and layout issues may create barriers to task performance, potentially contributing to the escalation of FDs in the OR.
Allocating operating room block time using historical caseload variability
Operating room (OR) allocation and planning is one of the most important strategic decisions that OR managers face. The number of ORs that a hospital opens depends on the number of blocks that are allocated to surgical groups, services, or individual surgeons, combined with the amount of open posting time (i.e., first-come, first-served posting) that the hospital wants to provide. By allocating too few ORs, a hospital may turn away surgery demand, whereas opening too many ORs could prove to be a costly decision. The traditional method of determining block frequency and size considers the average historical surgery demand for each group. However, given that there are penalties to the system for having too much or too little OR time allocated to a group, demand variability should play a role in determining the real OR requirement. In this paper we present an algorithm that allocates block time based on this demand variability, specifically accounting for both over-utilized time (time used beyond the block) and under-utilized time (time unused within the block). This algorithm provides a solution for the situation in which total caseload demand can be accommodated by the total OR resource set, in other words a setting that is not capacity-constrained. We have found this scenario to be common among several regional healthcare providers with large OR suites and excess capacity. This algorithm could be used to adjust existing blocks or to assign new blocks to surgeons who did not previously have a block. We have also studied the effect of turnover time on the number of ORs that need to be allocated. Numerical experiments based on real data from a large healthcare provider indicate the opportunity to achieve over 2,900 hours of OR time savings through improved block allocations.
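The trade-off this abstract describes, penalizing both over-utilized and under-utilized block time under demand variability, can be sketched as a newsvendor-style quantile rule. This is a simplified illustration under assumed linear penalty costs, not the paper's actual algorithm, and the function name and cost values are hypothetical:

```python
def block_hours(demand_samples, cost_over=1.5, cost_under=1.0):
    """Pick a block length (hours) minimizing expected penalty from
    over-utilized time (cases running past the block, cost_over per hour)
    and under-utilized time (unused block time, cost_under per hour),
    given historical caseload hours for one surgical group."""
    xs = sorted(demand_samples)
    # Newsvendor critical fractile: the block should cover demand with
    # probability cost_over / (cost_over + cost_under).
    q = cost_over / (cost_over + cost_under)
    # Linearly interpolated empirical quantile of the historical demand.
    pos = q * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)
```

With equal penalties the rule returns the median historical caseload; as over-utilization (overtime) becomes relatively more expensive, the allocated block grows toward the upper tail of demand, which is exactly the role variability plays in the abstract's argument.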
Simulating the effects of operating room staff movement and door opening policies on microbial load
Objective: To identify factors that increase the microbial load in the operating room (OR) and recommend solutions to minimize the effect of these factors. Design: Observation and sampling study. Setting: Academic health center, public hospitals. Methods: We analyzed 4 videotaped orthopedic surgeries (15 hours in total) for door openings and staff movement. The data were translated into a script denoting a representative frequency and location of movements for each OR team member. These activities were then simulated for 30 minutes per trial in a functional operating room by the researchers re-enacting OR staff-member roles, while collecting bacteria and fungi using settle plates. To test the hypotheses on the influence of activity on microbial load, an experimental design was created in which each factor was tested at higher (and lower) than normal activity settings for a 30-minute period. These trials were conducted in 2 phases. Results: The frequency of door opening did not independently affect the microbial load in the OR. However, a longer duration and greater width of door opening led to increased microbial load in the OR. Increased staff movement also increased the microbial load. There was a significantly higher microbial load on the floor than at waist level. Conclusions: Movement of staff and the duration and width of door opening clearly affect the OR microbial load. However, further investigation is needed to determine how the number of staff affects the microbial load and how to reduce the microbial load at the surgical table.
First Case On-Time Starts Measured by Incision On-Time and No Grace Period
A delay in first case on-time starts (FCOTS) can lead to less operating room (OR) utilization, greater facility costs, and dissatisfaction among staff and patients. FCOTS is usually measured by the patient in-room metric with a small grace period. For this study, the partnering hospital elected to target and improve delays by aggressively defining FCOTS as time of incision with no grace period. Metric standardization, goal setting, and organizational focus contributed to a 9-month implementation plan to improve the newly defined FCOTS metric. The target was achieved during implementation, with 73.6% of first cases starting on time. Annual impact showed 80,587 min, or 1,343 hr, of saved OR time, which led to $771,000 in annual savings for variable OR labor costs. This redefined metric and related interventions contributed to significant reduction in delays and savings to the hospital. Engaged physician leadership played a key role in this improvement initiative, as well. The methods employed here can be used in other hospitals looking to improve FCOTS metrics in their procedural areas.
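As a quick consistency check on the figures reported above (the per-hour rate in the last line is derived from the abstract's numbers, not stated in it):

```python
saved_min = 80_587                         # reported annual saved OR time (minutes)
saved_hr = saved_min / 60                  # about 1343.1, matching the reported 1,343 hr
annual_savings = 771_000                   # reported variable OR labor savings ($)
implied_rate = annual_savings / saved_hr   # roughly $574 per saved OR-hour (derived)
```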
Performance metrics analysis for aircraft maintenance process control
Purpose – Performance measurements, or metrics, measure a company's performance and behavior and are used to help an organization achieve and maintain success. Without the use of performance metrics, it is difficult to know whether or not the firm is meeting requirements or making desired improvements. During the course of this study with Lockheed Martin, the research team was tasked with determining the effectiveness of the site's existing performance metrics. The paper aims to discuss these issues. Design/methodology/approach – Research indicates that there are five key elements that influence the success of a performance metric. A standardized method of determining whether or not a metric has the right mix of these elements was created in the form of a metrics scorecard. Findings – The scorecard survey was successful in revealing good metric use, as well as problematic metrics. In the quality department, the Document Rejects metric has been reworked and is no longer within the executive's metric deck. It was also recommended to add root cause analysis, and to quantify and track the cost of non-conformance and the overall cost of quality. In total, the number of site-wide metrics has decreased from 75 to 50. The 50 remaining metrics are undergoing a continuous improvement process in conjunction with the use of the metric scorecard tool developed in this research. Research limitations/implications – The metrics scorecard should be used site-wide for an assessment of all metrics. The focus of this paper is on the metrics within the quality department. Practical implications – Putting a quick and efficient metrics assessment technique in place was critical. With the leadership and participation of Lockheed Martin, this goal was accomplished. Originality/value – This paper presents the process of metrics evaluation and the issues that were encountered during the process, including insights that would not have been easily documented without this mechanism. Lockheed Martin has used results from this research. Other industries could also apply the methods proposed here.
Prioritization strategies for patient evacuations
Evacuation from a health care facility is considered a last resort, and in the event of a complete evacuation, a standard planning assumption is that all patients will be evacuated. A literature review of the suggested prioritization strategies for evacuation planning—as well as the transportation priorities used in actual facility evacuations—shows a lack of consensus about whether critical or non-critical care patients should be transferred first. In addition, these policies are implicitly “greedy” in that one patient group is given priority, and patients from that group are completely evacuated before any patients are evacuated from the other group. The purpose of this paper is to present a dynamic programming model for emergency patient evacuations, show that a greedy, “all-or-nothing” policy is not always optimal, and discuss insights from the resulting optimal prioritization strategies for unit- or floor-level evacuations.
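The kind of dynamic program described can be sketched as follows. The harm model, transfer times, and function name below are hypothetical illustrations, not the paper's formulation:

```python
from functools import lru_cache

def min_harm(n_crit, n_noncrit, h_c=3.0, h_n=1.0, t_c=2, t_n=1):
    """Minimum cumulative waiting harm to evacuate n_crit critical and
    n_noncrit non-critical patients with one vehicle.

    Toy assumptions: a critical transfer occupies the vehicle for t_c
    periods and a non-critical one for t_n; every patient not yet
    evacuated accrues h_c or h_n harm per period."""
    @lru_cache(maxsize=None)
    def f(c, n):
        if c == 0 and n == 0:
            return 0.0
        # Harm accrued per period while the next trip is underway.
        waiting_rate = c * h_c + n * h_n
        best = float("inf")
        if c > 0:   # send a critical patient next
            best = min(best, t_c * waiting_rate + f(c - 1, n))
        if n > 0:   # send a non-critical patient next
            best = min(best, t_n * waiting_rate + f(c, n - 1))
        return best
    return f(n_crit, n_noncrit)
```

Because the recursion considers sending either group at every state, it can discover interleaved evacuation orders, which is precisely what an all-or-nothing greedy policy rules out.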