6,367 results for "Public administration Research Methodology."
Networks and collaboration in the public sector : essential research approaches, methodologies and analytic tools
\"Networks and other collaborations are central to the public sector's ability to respond to their diverse responsibilities, from international development and regional governance, to policy development and service provision. Great strides have been made towards understanding their formation, governance and management, but more opportunities to explore methodologies and measures is required to ensure they are properly understood. This volume showcases an array of selected research methods and analytics tools currently used by scholars and practitioners in network and collaboration research, as well as emerging styles of empirical investigation. Although it cannot attempt to capture all technical details for each one, this book provides a unique catalogue of compelling methods for researchers and practitioners, which are illustrated extensively with applications in the public and non-profit sector. By bringing together leading and upcoming scholars in network research, the book will be of enormous assistance in guiding students and scholars in public management to study collaboration and networks empirically by demonstrating the core research approaches and tools for investigating and evaluating these crucially important arrangements\"-- Provided by publisher.
Research Methods for Public Administrators
Without jargon or mathematical theory to hinder a quick understanding and use, here are the research tools and techniques you can grasp and immediately apply to obtain research services from others or do research yourself. Johnson makes clear that to succeed in any public agency management position, you have to be able to think analytically and know how to assess the quality of research results. By providing the underlying concepts and just enough methodology to operationalize them, she gives you exactly what you need—in a clear, straightforward way that takes the fear out of learning. You will find here an especially wide range of practical guidelines and examples, all from the author's own and others' experiences in a variety of settings within the public sector. Throughout her book she emphasizes the how of research—how to do it, how to make sense of its findings—and covers all the basic statistical tools, concentrating steadily on interpreting research results. An important, reader-friendly text for students of public administration, and for their often perplexed colleagues already on the job. Johnson explains that public administrators do not do research themselves all that often. But with the rising demand for results measurement, balanced scorecards, benchmarking and assessing customer satisfaction, they do need to understand the basics of what research is and have more than just a glimmer of how it is done. Her book offers both—a simple, easily grasped presentation of research concepts and principles, plus all the essentials of doing program evaluation, policy analysis, and applied social science. It is especially useful as a text in courses such as research methods, program evaluation and introduction to applied statistics, usually found in public administration programs at the undergraduate and graduate levels. And for people already in jobs outside the academic community, people who are now asked to do tasks that they seldom did before—and never expected they would be asked to do—it is essential.
Comparative policy studies : conceptual and methodological challenges
\"The role of comparative analysis in policy studies has gained increasing importance in recent years. Comparative policy studies aims at comparing and contrasting public policy making across sectoral, regional and national boundaries in order to overcome challenges in the formulation, implementation and evaluation of public policy. This book is the first in its field to provide scholars and policy-makers with both compelling comparative research design and methodology in one place. In contrast to general manuals on comparative methodology, this book specifically addresses key research design and methodological challenges that comparative policy studies typically faces and draws on rich empirical illustrations. Written by an outstanding cast of contributors, this volume is essential reading for scholars and students of comparative public policy. \"-- Provided by publisher.
Working Together
Advances in the social sciences have emerged through a variety of research methods: field-based research, laboratory and field experiments, and agent-based models. However, which research method or approach is best suited to a particular inquiry is frequently debated and discussed. Working Together examines how different methods have promoted various theoretical developments related to collective action and the commons, and demonstrates the importance of cross-fertilization involving multimethod research across traditional boundaries. The authors look at why cross-fertilization is difficult to achieve, and they show ways to overcome these challenges through collaboration. The authors provide numerous examples of collaborative, multimethod research related to collective action and the commons. They examine the pros and cons of case studies, meta-analyses, large-N field research, experiments and modeling, and empirically grounded agent-based models, and they consider how these methods contribute to research on collective action for the management of natural resources. Using their findings, the authors outline a revised theory of collective action that includes three elements: individual decision making, microsituational conditions, and features of the broader social-ecological context. Acknowledging the academic incentives that influence and constrain how research is conducted, Working Together reworks the theory of collective action and offers practical solutions for researchers and students across a spectrum of disciplines.
Quality assessment with diverse studies (QuADS): an appraisal tool for methodological and reporting quality in systematic reviews of mixed- or multi-method studies
Background: In the context of the volume of mixed- and multi-methods studies in health services research, the present study sought to develop an appraisal tool to determine the methodological and reporting quality of such studies when included in systematic reviews. Evaluative evidence regarding the design and use of our existing Quality Assessment Tool for Studies with Diverse Designs (QATSDD) was synthesised to enhance and refine it for application across health services research.

Methods: Secondary data were collected through a literature review of all articles identified using Google Scholar that had cited the QATSDD tool from its inception in 2012 to December 2019. First authors of all papers that had cited the QATSDD (n = 197) were also invited to provide further evaluative data via a qualitative online survey. Evaluative findings from the survey and literature review were synthesised narratively, and these data were used to identify areas requiring refinement. The refined tool was subject to inter-rater reliability, face validity and content validity analyses.

Results: Key limitations of the QATSDD tool related to a lack of clarity regarding its scope of use and the ease of applying its criteria beyond experimental psychological research. The Quality Appraisal for Diverse Studies (QuADS) tool emerged as a revised tool that addresses the limitations of the QATSDD. The QuADS tool demonstrated substantial inter-rater reliability (k = 0.66), as well as face and content validity, for application in systematic reviews of mixed- or multi-methods health services research.

Conclusion: Our findings highlight the perceived value of appraisal tools for determining the methodological and reporting quality of studies in reviews that include heterogeneous studies. The QuADS tool demonstrates strong reliability and ease of use for application to multi- or mixed-methods health services research.
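The inter-rater reliability reported above (k = 0.66) is a kappa statistic, which corrects the raw agreement between two raters for the agreement expected by chance. As a rough illustration of what that figure measures, here is a minimal Python sketch of Cohen's kappa for two raters scoring the same set of studies; the scores below are invented for the example and are not data from the QuADS study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance, given each rater's
    marginal frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())
    return (p_o - p_e) / (1 - p_e)

# Hypothetical quality scores (0-3) given by two reviewers to ten studies.
scores_a = [3, 2, 2, 1, 3, 0, 2, 1, 3, 2]
scores_b = [3, 2, 1, 1, 3, 0, 2, 2, 3, 2]
print(f"kappa = {cohens_kappa(scores_a, scores_b):.2f}")  # kappa = 0.71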
Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series
The GRADE-CERQual (‘Confidence in the Evidence from Reviews of Qualitative research’) approach provides guidance for assessing how much confidence to place in findings from systematic reviews of qualitative research (or qualitative evidence syntheses). The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. Confidence in the evidence from qualitative evidence syntheses is an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. CERQual provides a systematic and transparent framework for assessing confidence in individual review findings, based on consideration of four components: (1) methodological limitations, (2) coherence, (3) adequacy of data, and (4) relevance. A fifth component, dissemination (or publication) bias, may also be important and is being explored. As with the GRADE (Grading of Recommendations Assessment, Development, and Evaluation) approach for effectiveness evidence, CERQual suggests summarising evidence in succinct, transparent, and informative Summary of Qualitative Findings tables. These tables are designed to communicate the review findings and the CERQual assessment of confidence in each finding. This article is the first of a seven-part series providing guidance on how to apply the CERQual approach. In this paper, we describe the rationale and conceptual basis for CERQual, the aims of the approach, how the approach was developed, and its main components. We also outline the purpose and structure of this series and discuss the growing role for qualitative evidence in decision-making. Papers 3, 4, 5, 6, and 7 in this series discuss each CERQual component, including the rationale for including the component in the approach, how the component is conceptualised, and how it should be assessed. Paper 2 discusses how to make an overall assessment of confidence in a review finding and how to create a Summary of Qualitative Findings table. The series is intended primarily for those undertaking qualitative evidence syntheses or using their findings in decision-making processes but is also relevant to guideline development agencies, primary qualitative researchers, and implementation scientists and practitioners.
Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 2: how to make an overall CERQual assessment of confidence and create a Summary of Qualitative Findings table
Background: The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group to support the use of findings from qualitative evidence syntheses in decision making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual, and it focuses on making an overall assessment of confidence in a review finding and creating a CERQual Evidence Profile and a CERQual Summary of Qualitative Findings table.

Methods: We developed this guidance by examining the methods used by other GRADE approaches, gathering feedback from relevant research communities and developing consensus through project group meetings. We then piloted the guidance on several qualitative evidence syntheses before agreeing on the approach.

Results: Confidence in the evidence is an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. Creating a summary of each review finding and deciding whether or not CERQual should be used are important steps prior to assessing confidence. Confidence should be assessed for each review finding individually, based on the judgements made for each of the four CERQual components. Four levels are used to describe the overall assessment of confidence: high, moderate, low or very low. The overall CERQual assessment for each review finding should be explained in a CERQual Evidence Profile and Summary of Qualitative Findings table.

Conclusions: Structuring and summarising review findings, assessing confidence in those findings using CERQual and creating a CERQual Evidence Profile and Summary of Qualitative Findings table should be essential components of undertaking qualitative evidence syntheses. This paper describes the end point of a CERQual assessment and should be read in conjunction with the other papers in the series, which provide information on assessing the individual CERQual components.
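To make the structure described above concrete, here is a small illustrative Python sketch of one row of a Summary of Qualitative Findings table: a review finding, a judgement for each of the four CERQual components, and an overall confidence level. The field names and the example finding are our own invention for illustration, not the official CERQual template.

```python
from dataclasses import dataclass
from enum import Enum

class Confidence(Enum):
    """The four overall confidence levels used by CERQual."""
    HIGH = "high"
    MODERATE = "moderate"
    LOW = "low"
    VERY_LOW = "very low"

@dataclass
class SummaryOfFindingsRow:
    """One row of a Summary of Qualitative Findings table (illustrative)."""
    finding: str                     # summary of the review finding
    methodological_limitations: str  # component 1: judgement and rationale
    coherence: str                   # component 2
    adequacy_of_data: str            # component 3
    relevance: str                   # component 4
    overall_confidence: Confidence   # overall CERQual assessment
    explanation: str                 # why this overall level was chosen

# Hypothetical example row.
row = SummaryOfFindingsRow(
    finding="Patients value continuity of care (hypothetical finding).",
    methodological_limitations="Minor concerns: two studies lacked reflexivity.",
    coherence="No or very minor concerns.",
    adequacy_of_data="Moderate concerns: thin data from three small studies.",
    relevance="No or very minor concerns.",
    overall_confidence=Confidence.MODERATE,
    explanation="Downgraded one level for adequacy of data.",
)
print(row.overall_confidence.value)  # "moderate"
```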
Meta-ethnography in healthcare research: a guide to using a meta-ethnographic approach for literature synthesis
Background: Qualitative synthesis approaches are increasingly used in healthcare research. One of the most commonly used approaches is meta-ethnography, a systematic approach which synthesises data from multiple studies to enable new insights into patients' and healthcare professionals' experiences and perspectives. Meta-ethnographies can provide important theoretical and conceptual contributions and generate evidence for healthcare practice and policy. However, there is currently a lack of clarity and guidance surrounding the data synthesis stages and process.

Method: This paper aimed to outline a step-by-step method for conducting a meta-ethnography, with illustrative examples.

Results: A practical step-by-step guide for conducting meta-ethnography, based on the seven steps originally developed by Noblit and Hare (Meta-ethnography: Synthesizing Qualitative Studies, 1988), is presented. The stages are: getting started; deciding what is relevant to the initial interest; reading the studies; determining how the studies are related; translating the studies into one another; synthesising the translations; and expressing the synthesis. We have incorporated adaptations and developments from recent publications. Annotations based on a previous meta-ethnography are provided. These are particularly detailed for stages 4-6, as these are often described as the most challenging to conduct while having the least guidance available.

Conclusion: Meta-ethnographic synthesis is an important and increasingly used tool in healthcare research, which can be used to inform policy and practice. The guide presented clarifies how the stages and processes involved in conducting a meta-synthesis can be operationalised.
Planning for and Assessing Rigor in Rapid Qualitative Analysis (PARRQA): a consensus-based framework for designing, conducting, and reporting
Background: The use of rapid qualitative methods has increased substantially over the past decade in quality improvement and health services research. These methods have gained traction in implementation research and practice, where real-time adjustments are often made to optimize processes and outcomes. This brisk increase raises several questions: what does rigor entail in projects that use rapid qualitative analysis (RQA)? How do we define a pragmatic framework to help research teams design and conduct rigorous and valid rapid qualitative projects? How can authors articulate rigor in their methods descriptions? And how can reviewers evaluate the rigor of rapid qualitative projects?

Methods: A team of seven interdisciplinary qualitative methods experts developed a framework for ensuring rigor and validity in RQA and in methods suitable for this analytic approach. We conducted a qualitative evidence synthesis to identify gaps in the literature and then drew upon the literature, standard procedures within our teams, and a repository of rapid qualitative training materials to create a planning and reporting framework. We iteratively refined this framework through 11 group working meetings (60-90 minutes each) over the course of one year and invited feedback on items to ensure their completeness, clarity, and comprehensibility.

Results: The Planning for and Assessing Rigor in Rapid Qualitative Analysis (PARRQA) framework is organized progressively across phases from design to dissemination: (1) rigorous design (rationale and staffing), (2) semi-structured data collection (pilot and planning), (3) RQA: summary template development (accuracy and calibration), (4) RQA: matrix analysis (matrices), and (5) rapid qualitative data synthesis. Eighteen recommendations across these sections specify best practices for rigor and validity.

Conclusions: Rapid qualitative methods play a central role in implementation evaluations, with the potential to yield prompt information and insights about context, processes, and relationships. However, guidance on how to assess rigor is nascent. The PARRQA framework adds to the literature by offering criteria for appropriate planning for, and assessment of, rigor in projects that involve RQA, and it provides a consensus-based resource to support a high level of qualitative methodological rigor in implementation science.
Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR)
Background: Qualitative approaches, alone or in mixed methods, are prominent within implementation science. However, traditional qualitative approaches are resource intensive, which has led to the development of rapid qualitative approaches. Published rapid approaches are often inductive in nature and rely on transcripts of interviews. We describe a deductive rapid analysis approach using the Consolidated Framework for Implementation Research (CFIR) that uses notes and audio recordings. This paper compares our rapid and traditional deductive CFIR approaches.

Methods: Semi-structured interviews were conducted for two cohorts of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE). The CFIR guided data collection and analysis. In cohort A, we used our traditional CFIR-based deductive analysis approach (directed content analysis), in which two analysts completed independent in-depth manual coding of interview transcripts using qualitative software. In cohort B, we used our new rapid CFIR-based deductive analysis approach (also a directed content analysis), in which the primary analyst wrote detailed notes during interviews and immediately "coded" the notes into an MS Excel CFIR construct-by-facility matrix; a secondary analyst then listened to the audio recordings and edited the matrix. We tracked time for both approaches using a spreadsheet and captured transcription costs from invoices. We retrospectively compared the approaches in terms of effectiveness and rigor.

Results: Cohorts A and B were similar in the amount of data collected. However, our rapid deductive CFIR approach required 409.5 analyst hours, compared to 683 hours for the traditional deductive CFIR approach, and it eliminated $7,250 in transcription costs. The facility-level analysis phase provided the greatest savings: 14 hours per facility for the traditional analysis versus 3.92 hours per facility for the rapid analysis. Data interpretation required the same number of hours for both approaches.

Conclusion: Our rapid deductive CFIR approach was less time intensive and eliminated transcription costs, yet it was effective in meeting evaluation objectives and establishing rigor. Researchers should consider the following when employing our approach: (1) team expertise in the CFIR and qualitative methods, (2) the level of detail needed to meet project aims, (3) the mode of data to analyze, and (4) the advantages and disadvantages of using the CFIR.
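As a quick check on the savings reported above, the quoted figures imply roughly 273.5 analyst hours saved overall (about 40% of the traditional total) and about 10 hours saved per facility at the analysis phase. A tiny Python sketch using only the numbers given in the abstract:

```python
# Figures quoted in the abstract above.
traditional_hours = 683.0   # total analyst hours, traditional approach
rapid_hours = 409.5         # total analyst hours, rapid approach
transcription_cost = 7250   # USD eliminated by the rapid approach

hours_saved = traditional_hours - rapid_hours
percent_saved = 100 * hours_saved / traditional_hours
print(f"{hours_saved} analyst hours saved "
      f"({percent_saved:.0f}% of the traditional total)")  # 273.5 hours (40%)

# Facility-level analysis phase.
per_facility_saved = 14.0 - 3.92
print(f"{per_facility_saved:.2f} hours saved per facility")  # 10.08 hours
```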