21 result(s) for "Bunce, Arwen"
Lessons learned about the effective operationalization of champions as an implementation strategy: results from a qualitative process evaluation of a pragmatic trial
Background: Though the knowledge base on implementation strategies is growing, much remains unknown about how to most effectively operationalize these strategies in diverse contexts. For example, while evidence shows that champions can effectively support implementation efforts in some circumstances, little has been reported on how to operationalize this role optimally in different settings, or on the specific pathways through which champions enact change.
Methods: This is a secondary analysis of data from a pragmatic trial comparing implementation strategies supporting the adoption of guideline-concordant cardioprotective prescribing in community health centers in the USA. Quantitative data came from the community health centers’ shared electronic health record; qualitative data sources included community health center staff interviews over 3 years. Using a convergent mixed-methods design, data were collected concurrently and merged for interpretation to identify factors associated with improved outcomes. Qualitative analysis was guided by the constant comparative method. As results from the quantitative and initial qualitative analyses indicated the essential role that champions played in promoting guideline-concordant prescribing, we conducted multiple immersion-crystallization cycles to better understand this finding.
Results: Five community health centers demonstrated statistically significant increases in guideline-concordant cardioprotective prescribing. A combination of factors appeared key to their successful practice change: (1) a clinician champion who demonstrated a sustained commitment to implementation activities and exhibited engagement, influence, credibility, and capacity; and (2) organizational support for the intervention. In contrast, the seven community health centers that did not show improved outcomes lacked a champion with the necessary characteristics, and/or organizational support. Case studies illustrate the diverse, context-specific pathways that enabled or prevented study implementers from advancing practice change.
Conclusion: This analysis confirms the important role of champions in implementation efforts and offers insight into the context-specific mechanisms through which champions enact practice change. The results also highlight the potential impact of misaligned implementation support and key modifiable barriers and facilitators on implementation outcomes. Here, unexamined assumptions and a lack of evidence-based guidance on how best to identify and prepare effective champions led to implementation support that failed to address important barriers to intervention success.
Trial registration: ClinicalTrials.gov, NCT02325531. Registered 15 December 2014.
Strengthening methods for tracking adaptations and modifications to implementation strategies
Background: Developing effective implementation strategies requires adequate tracking and reporting on their application. Guidelines exist for defining and reporting on implementation strategy characteristics, but not for describing how strategies are adapted and modified in practice. We built on existing implementation science methods to provide novel methods for tracking strategy modifications.
Methods: These methods were developed within a stepped-wedge trial of an implementation strategy package designed to help community clinics adopt social determinants of health-related activities: in brief, an ‘Implementation Support Team’ supports clinics through a multi-step process. These methods involve five components: 1) describe the planned strategy; 2) track its use; 3) monitor barriers; 4) describe modifications; and 5) identify and describe new strategies. We used the Expert Recommendations for Implementing Change taxonomy to categorize strategies, Proctor et al.’s reporting framework to describe them, the Consolidated Framework for Implementation Research to code barriers and contextual factors necessitating modifications, and elements of the Framework for Reporting Adaptations and Modifications-Enhanced to describe strategy modifications.
Results: We present three examples of the use of these methods: 1) modifications made to a facilitation-focused strategy (clinics reported that certain meetings were too frequent, so their frequency was reduced in subsequent wedges); 2) a clinic-level strategy addition, which involved connecting one study clinic seeking help with community health worker-related workflows to another that already had such a workflow in place; and 3) a study-level strategy addition, which involved providing assistance in overcoming previously encountered (rather than de novo) challenges.
Conclusions: These methods for tracking modifications made to implementation strategies build on existing methods, frameworks, and guidelines; however, as none of these were a perfect fit, we made additions to several frameworks as indicated, and used certain frameworks’ components selectively. While these methods are time-intensive, and more work is needed to streamline them, they are among the first such methods presented in implementation science. As such, they may be used in research assessing effective strategy modifications and in replication and scale-up of effective strategies. We present these methods to guide others seeking to document implementation strategies and their modifications in their studies.
Trial registration: ClinicalTrials.gov, NCT03607617 (first posted 31/07/2018).
Applying Realist Retroduction to EHR-Based Clinical Decision Support Tool Development
The application of realist-informed approaches to implementation research can produce answers to why, for whom and under what circumstances social determinants of health interventions work. In the context of a study to develop and test EHR-based clinical decision support tools that suggest adjusting care plans in response to patient-reported financial, housing, food, transportation, and utilities insecurity, the authors applied an innovative use of realist principles in a bounded, mid-study task. This paper demonstrates how realist retroduction can be applied in intervention development processes. Retroduction proved useful in identifying the often intangible clinical needs and preferences that affected decision support tool desirability and use, which then guided the revision of five tools prior to a formal trial. This paper illustrates how data from the study development phases were put in service of retroductive steps that, through the identification of tentative program theories, guided revision of the pilot electronic tools to better meet clinic needs in the study trial phase. Applying retroductive thinking to establish what may be more or less effective under real-world conditions before participants are recruited is a productive, pragmatic form of researcher/stakeholder co-design that seeks to achieve results without wasting clinical teams’ time.
Reporting on the Strategies Needed to Implement Proven Interventions: An Example From a “Real-World” Cross-Setting Implementation Study
The objective of this study was to empirically demonstrate the use of a new framework for describing the strategies used to implement quality improvement interventions and provide an example that others may follow. Implementation strategies are the specific approaches, methods, structures, and resources used to introduce and encourage uptake of a given intervention's components. Such strategies have not been regularly reported in descriptions of interventions' effectiveness, or in assessments of how proven interventions are implemented in new settings. This lack of reporting may hinder efforts to successfully translate effective interventions into “real-world” practice. A recently published framework was designed to standardize reporting on implementation strategies in the implementation science literature. We applied this framework to describe the strategies used to implement a single intervention in its original commercial care setting, and when implemented in community health centers from September 2010 through May 2015. Per this framework, the target (clinic staff) and outcome (prescribing rates) remained the same across settings; the actor, action, temporality, and dose were adapted to fit local context. The framework proved helpful in articulating which of the implementation strategies were kept constant and which were tailored to fit diverse settings, and simplified our reporting of their effects. Researchers should consider consistently reporting this information, which could be crucial to the success or failure of implementing proven interventions effectively across diverse care settings. ClinicalTrials.gov Identifier: NCT02299791.
Study protocol: a pragmatic, stepped-wedge trial of tailored support for implementing social determinants of health documentation/action in community health centers, with realist evaluation
Background: National leaders recommend documenting social determinants of health, and actions taken to address them, in electronic health records, and a growing body of evidence suggests the health benefits of doing so. However, little evidence exists to guide implementation of social determinants of health documentation/action.
Methods: This paper describes a 5-year, mixed-methods, stepped-wedge trial with realist evaluation, designed to test the impact of providing 30 community health centers with step-by-step guidance on implementing electronic health record-based social determinants of health documentation. This guidance will entail 6 months of tailored support from an interdisciplinary team, including training and technical assistance. We will report on the tailored support provided at each of five implementation steps; the impact of tailored implementation support; a method for tracking such tailoring; and the context-specific pathways through which these tailored strategies effect change. We will track the competencies and resources needed to support the study clinics’ implementation efforts.
Discussion: Results will inform how to tailor implementation strategies to meet local needs in real-world practice settings. Secondary analyses will assess impacts of social determinants of health documentation and referral-making on diabetes outcomes. By learning whether and how scalable, tailored implementation strategies help community health centers adopt social determinants of health documentation and action, this study will yield timely guidance to primary care providers. We are not aware of previous studies exploring implementation strategies that support adoption of social determinants of health documentation and action using electronic health records, despite the pressing need for such guidance.
Trial registration: ClinicalTrials.gov, NCT03607617. Registration date: 7/31/2018 (retrospectively registered).
Using Electronic Health Record–Based Clinical Decision Support to Provide Social Risk–Informed Care in Community Health Centers: Protocol for the Design and Assessment of a Clinical Decision Support Tool
Background: Consistent and compelling evidence demonstrates that social and economic adversity has an impact on health outcomes. In response, many health care professional organizations recommend screening patients for experiences of social and economic adversity or social risks (for example, food, housing, and transportation insecurity) in the context of care. Guidance on how health care providers can act on documented social risk data to improve health outcomes is nascent. A strategy recommended by the National Academy of Medicine involves using social risk data to adapt care plans in ways that accommodate patients’ social risks.
Objective: This study’s aims are to develop electronic health record (EHR)–based clinical decision support (CDS) tools that suggest social risk–informed care plan adaptations for patients with diabetes or hypertension, assess tool adoption and its impact on selected clinical quality measures in community health centers, and examine perceptions of tool usability and impact on care quality.
Methods: A systematic scoping review and several stakeholder activities will be conducted to inform development of the CDS tools. The tools will be pilot-tested to obtain user input, and their content and form will be revised based on this input. A randomized quasi-experimental design will then be used to assess the impact of the revised tools. Eligible clinics will be randomized to a control group or potential intervention group; clinics will be recruited from the potential intervention group in random order until 6 are enrolled in the study. Intervention clinics will have access to the CDS tools in their EHR, will receive minimal implementation support, and will be followed for 18 months to evaluate tool adoption and the impact of tool use on patient blood pressure and glucose control.
Results: This study was funded in January 2020 by the National Institute on Minority Health and Health Disparities of the National Institutes of Health. Formative activities will take place from April 2020 to July 2021, the CDS tools will be developed between May 2021 and November 2022, the pilot study will be conducted from August 2021 to July 2022, and the main trial will occur from December 2022 to May 2024. Study data will be analyzed, and the results will be disseminated in 2024.
Conclusions: Patients’ social risk information must be presented to care teams in a way that facilitates social risk–informed care. To our knowledge, this study is the first to develop and test EHR-embedded CDS tools designed to support the provision of social risk–informed care. The study results will add a needed understanding of how to use social risk data to improve health outcomes and reduce disparities.
International Registered Report Identifier (IRRID): PRR1-10.2196/31733
Ethnographic process evaluation in primary care: explaining the complexity of implementation
Background: The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health information technology (HIT)-based intervention focused on patients with diabetes.
Discussion: Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required.
Summary: The deep understanding of the ‘how’ and ‘why’ behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.
Unintended consequences: a qualitative study exploring the impact of collecting implementation process data with phone interviews on implementation activities
Background: Qualitative data are crucial for capturing implementation processes, and thus necessary for understanding implementation trial outcomes. Typical methods for capturing such data include observations, focus groups, and interviews. Yet little consideration has been given to how such methods create interactions between researchers and study participants, which may affect participants’ engagement, and thus implementation activities and study outcomes. In the context of a clinical trial, we assessed whether and how ongoing telephone check-ins to collect data about implementation activities impacted the quality of the collected data and participants’ engagement in study activities.
Methods: Researchers conducted regular phone check-ins with clinic staff serving as implementers in an implementation study. Approximately 1 year into this trial, 19 of these study implementers were queried about the impact of these calls on study engagement and implementation activities. The two researchers who collected implementation process data through phone check-ins with the study implementers were also interviewed about their perceptions of the impact of the check-ins.
Results: Study implementers’ assessments of the check-ins’ impact fell into three categories: (1) the check-ins had no effect on implementation activities; (2) the check-ins served as a reminder about study participation (without a clear impact on implementation activities); and (3) the check-ins caused changes in implementation activities. The researchers similarly perceived that the phone check-ins served as reminders and encouraged some implementers’ engagement in implementation activities; their ongoing nature also created personal connections with study implementers that may have impacted implementation activities. Among some study implementers, anticipation of the check-in calls also improved their ability to recount implementation activities and positively affected the quality of the data collected.
Conclusion: These results illustrate the potential impact of qualitative data collection on implementation activities during implementation science trials. Mitigating such effects may prove challenging, but acknowledging these consequences, or even embracing them, perhaps by designing data collection methods as implementation strategies, could enhance scientific rigor. This work is presented to stimulate debate about the complexities involved in capturing data on implementation processes using common qualitative data collection methods.
Trial registration: ClinicalTrials.gov, NCT02325531. Registered 15 December 2014.
Does increased implementation support improve community clinics’ guideline-concordant care? Results of a mixed methods, pragmatic comparative effectiveness trial
Background: Disseminating care guidelines into clinical practice remains challenging, partly due to inadequate evidence on how best to help clinics incorporate new guidelines into routine care. This is particularly true in safety net community health centers (CHCs).
Methods: This pragmatic comparative effectiveness trial used a parallel mixed methods design. Twenty-nine CHC clinics were randomized to receive increasingly intensive implementation support (arm 1: implementation toolkit; arm 2: toolkit + in-person training + training webinars; arm 3: toolkit + training + webinars + offered practice facilitation) targeting uptake of electronic health record (EHR) tools focused on guideline-concordant cardioprotective prescribing for patients with diabetes. Outcomes were compared across study arms, to test whether increased support yielded additive improvements, and with 137 non-study CHCs that share the same EHR as the study clinics. Quantitative data from the CHCs’ EHR were used to compare the magnitude of change in guideline-concordant ACE/ARB and statin prescribing, using adjusted Poisson regressions. Qualitative data collected using diverse methods (e.g., interviews, observations) identified factors influencing the quantitative outcomes.
Results: Outcomes at CHCs receiving higher-intensity support did not improve in an additive pattern. ACE/ARB prescribing did not improve in any CHC group. Statin prescribing improved overall, and the improvement was significantly greater only in the arm 1 and arm 2 CHCs compared with the non-study CHCs. Factors influencing the finding of no additive impact included aspects of the EHR tools that reduced their utility, barriers to providing the intended implementation support, and study design elements (e.g., inability to adapt the provided support). Factors influencing overall improvements in statin outcomes likely included a secular trend in awareness of statin prescribing guidelines, selection bias whereby motivated clinics volunteered for the study, and study participation focusing clinic staff on the targeted outcomes.
Conclusions: Efforts to implement care guidelines should ensure adaptability when providing implementation support, conduct formative evaluations to determine the optimal form of such support for a given clinic, consider how study data collection influences adoption, and consider barriers to clinics’ ability to use and accept implementation support as planned. More research is needed on supporting change implementation in under-resourced settings like CHCs.
Trial registration: ClinicalTrials.gov, NCT02325531. Registered 15 December 2014.
Recommended practices for computerized clinical decision support and knowledge management in community settings: a qualitative study
Background: The purpose of this study was to identify recommended practices for computerized clinical decision support (CDS) development and implementation and for knowledge management (KM) processes in ambulatory clinics and community hospitals using commercial or locally developed systems in the U.S.
Methods: Guided by the Multiple Perspectives Framework, the authors conducted ethnographic field studies at two community hospitals and five ambulatory clinic organizations across the U.S. Using a Rapid Assessment Process, a multidisciplinary research team gathered preliminary assessment data; conducted on-site interviews, observations, and field surveys; analyzed data using both template and grounded methods; and developed universal themes. A panel of experts produced recommended practices.
Results: The team identified ten themes related to CDS and KM: 1) workflow; 2) knowledge management; 3) data as a foundation for CDS; 4) user–computer interaction; 5) measurement and metrics; 6) governance; 7) translation for collaboration; 8) the meaning of CDS; 9) roles of special, essential people; and 10) communication, training, and support. Experts developed recommendations about each theme. The original Multiple Perspectives Framework was modified to make explicit a new theoretical construct, that of Translational Interaction.
Conclusions: These ten themes represent areas that need attention if a clinic or community hospital plans to implement and successfully utilize CDS. In addition, they have implications for workforce education, research, and national-level policy development. The Translational Interaction construct could guide future applied informatics research endeavors.