Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
81,903 result(s) for "EVALUATION DESIGN"
User-centered evaluation of visual analytics
\"Visual analytics has come a long way since its inception in 2005. The amount of data in the world today has increased significantly and experts in many domains are struggling to make sense of their data. This book describes the efforts that go into analysis, including critical thinking, sensemaking, and various analytics techniques learned from the intelligence community. Support for these components is needed in order to provide the most utility for the expert users\"--Page 4 of cover.
What Is Design Thinking and Why Is It Important?
2012
Design thinking is generally defined as an analytic and creative process that engages a person in opportunities to experiment, create and prototype models, gather feedback, and redesign. Several characteristics (e.g., visualization, creativity) that a good design thinker should possess have been identified from the literature. The primary purpose of this article is to summarize and synthesize the research on design thinking to (a) better understand its characteristics and processes, as well as the differences between novice and expert design thinkers, and (b) apply the findings from the literature regarding the application of design thinking to our educational system. The authors' overarching goal is to identify the features and characteristics of design thinking and discuss its importance in promoting students' problem-solving skills in the 21st century.
Journal Article
Evaluating clinical and public health interventions : a practical guide to study design and statistics
\"Whether you are evaluating the effectiveness of a drug, a medical device, a behavioral intervention, a community mobilization, or even a new law, this is the book for you. Written in plain language, it simplifies the process of designing interventions, analyzing the data, and publishing the results. Because the choice of research design depends on the nature of the intervention, the book covers randomized and nonrandomized designs, prospective and retrospective studies, planned clinical trials and observational studies. In addition to reviewing standard statistical analysis, the book has easy-to-follow explanations of cutting edge techniques for evaluating interventions, including propensity score analysis, instrumental variable analysis, interrupted time series analysis and sensitivity analysis. All techniques are illustrated with up-to-date examples from medical and public health literature. This will be essential reading for a wide range of healthcare professionals involved in research as well as those more specifically interested in public health issues and epidemiology\"--Provided by publisher.
Propensity Score Matching
by Peikes, Deborah N.; Moreno, Lorenzo; Orzol, Sean Michael
in Applications, Case studies, Case study
2008
Over the past 25 years, evaluators of social programs have searched for nonexperimental methods that can substitute effectively for experimental ones. Recently, the spotlight has focused on one method, propensity score matching (PSM), as the suggested approach for evaluating employment and education programs. We present a case study of our experience using PSM, under seemingly ideal circumstances, for the evaluation of the State Partnership Initiative employment promotion program. Despite ideal conditions and the passing of statistical tests suggesting that the matching procedure had worked, we find that PSM produced incorrect impact estimates when compared with a randomized design. Based on this experience, we caution practitioners about the risks of implementing PSM-based designs.
Journal Article
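The study above compares propensity score matching (PSM) against a randomized benchmark. For readers unfamiliar with the method, the following is a minimal, hypothetical Python sketch of 1:1 nearest-neighbour matching on an estimated propensity score; the simulated data and the simple matching rule are illustrative only and are not taken from the article.

```python
# Illustrative sketch of propensity score matching (PSM).
# Data, variable names, and the 1:1 matching rule are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))                             # observed covariates
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # treatment depends on X
y = 2.0 * treated + X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=n)

# 1. Estimate propensity scores P(treated | X) with a logistic model.
scores = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated unit to its nearest control on the propensity score.
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(scores[c_idx].reshape(-1, 1))
_, matches = nn.kneighbors(scores[t_idx].reshape(-1, 1))
matched_controls = c_idx[matches.ravel()]

# 3. Estimate the effect on the treated as the mean outcome difference
#    between treated units and their matched controls.
att = (y[t_idx] - y[matched_controls]).mean()
print(f"Estimated effect on the treated: {att:.2f} (true simulated effect: 2.0)")
```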
A Taxonomy of Evaluation Methods for Information Systems Artifacts
by Akoka, Jacky; Prat, Nicolas; Comyn-Wattiau, Isabelle
in artifact evaluation, Benchmarks, Classification
2015
Artifacts, such as software systems, pervade organizations and society. In the field of information systems (IS) they form the core of research. The evaluation of IS artifacts thus represents a major issue. Although IS research paradigms are increasingly intertwined, building and evaluating artifacts has traditionally been the purview of design science research (DSR). DSR in IS has not reached maturity yet. This is particularly true of artifact evaluation. This paper investigates the "what" and the "how" of IS artifact evaluation: what are the objects and criteria of evaluation, the methods for evaluating the criteria, and the relationships between the "what" and the "how" of evaluation? To answer these questions, we develop a taxonomy of evaluation methods for IS artifacts. With this taxonomy, we analyze IS artifact evaluation practice, as reflected by ten years of DSR publications in the basket of journals of the Association for Information Systems (AIS). This research brings to light important relationships between the dimensions of IS artifact evaluation, and identifies seven typical evaluation patterns: demonstration; simulation- and metric-based benchmarking of artifacts; practice-based evaluation of effectiveness; simulation- and metric-based absolute evaluation of artifacts; practice-based evaluation of usefulness or ease of use; laboratory, student-based evaluation of usefulness; and algorithmic complexity analysis. This study also reveals a focus of artifact evaluation practice on a few criteria. Beyond immediate usefulness, IS researchers are urged to investigate ways of evaluating the long-term organizational impact and the societal impact of artifacts.
Journal Article
Analyzing Design Review Conversations
by Adams, Robin S.; Siddiqui, Junaid; Buzzanell, Patrice
in Art & Art History, Communication in design, Communication in engineering design
2016, 2015
Design is ubiquitous. Speaking across disciplines, it is a way of thinking that involves dealing with complex, open-ended, and contextualized problems that embody the ambiguities and contradictions in everyday life. It has become a part of pre-college education standards, is integral to how college prepares students for the future, and is playing a lead role in shaping a global innovation imperative. Efforts to advance design thinking, learning, and teaching have been the focus of the Design Thinking Research Symposium (DTRS) series. A unique feature of this series is a shared dataset in which leading design researchers globally are invited to apply their specific expertise to the dataset and bring their disciplinary interests in conversation with each other to bring together multiple facets of design thinking and catalyze new ways for teaching design thinking. Analyzing Design Review Conversations is organized around this shared dataset of conversations between those who give and those who receive feedback, guidance, or critique during a design review event. Design review conversations are a common and prevalent practice for helping designers develop design thinking expertise, although the structure and content of these reviews vary significantly. They make the design thinking of design coaches (instructors, experts, peers, and community and industry stakeholders) and design students visible. During a design review, coaches notice problematic and promising aspects of a designer's work. In this way, design students are supported in revisiting and critically evaluating their design rationales, and making sense of a design review experience in ways that allow them to construct their design thinking repertoire and evolving design identity.
Action Design Research
by Sein, Maung K.; Purao, Sandeep; Henfridsson, Ola
in Action design research, Action research, Automobile industry
2011
Design research (DR) positions information technology artifacts at the core of the Information Systems discipline. However, dominant DR thinking takes a technological view of the IT artifact, paying scant attention to its shaping by the organizational context. Consequently, existing DR methods focus on building the artifact and relegate evaluation to a subsequent and separate phase. They value technological rigor at the cost of organizational relevance, and fail to recognize that the artifact emerges from interaction with the organizational context even when its initial design is guided by the researchers' intent. We propose action design research (ADR) as a new DR method to address this problem. ADR reflects the premise that IT artifacts are ensembles shaped by the organizational context during development and use. The method conceptualizes the research process as containing the inseparable and inherently interwoven activities of building the IT artifact, intervening in the organization, and evaluating it concurrently. The essay describes the stages of ADR and associated principles that encapsulate its underlying beliefs and values. We illustrate ADR through a case of competence management at Volvo IT.
Journal Article