Catalogue Search | MBRL
6 result(s) for "MASNICK, Amy M"
A Model of Scientific Data Reasoning
2022
Data reasoning is an essential component of scientific reasoning, as a component of evidence evaluation. In this paper, we outline a model of scientific data reasoning that describes how data sensemaking underlies data reasoning. Data sensemaking, a relatively automatic process rooted in perceptual mechanisms that summarize large quantities of information in the environment, begins early in development, and is refined with experience, knowledge, and improved strategy use. Summarizing data highlights set properties such as central tendency and variability, and these properties are used to draw inferences from data. However, both data sensemaking and data reasoning are subject to cognitive biases or heuristics that can lead to flawed conclusions. The tools of scientific reasoning, including external representations, scientific hypothesis testing, and drawing probabilistic conclusions, can help reduce the likelihood of such flaws and help improve data reasoning. Although data sensemaking and data reasoning are not supplanted by scientific data reasoning, scientific reasoning skills can be leveraged to improve learning about science and reasoning with data.
Journal Article
Investigating the Development of Data Evaluation: The Role of Data Characteristics
2008
A crucial skill in scientific and everyday reasoning is the ability to interpret data. The present study examined how data features influence data interpretation. In Experiment 1, one hundred and thirty-three 9-year-olds, 12-year-olds, and college students (mean age = 20 years) were shown a series of data sets that varied in the number of observations and the amount of variance between and within observations. Only limited context for the data was provided. In Experiment 2, similar data sets were presented to 101 participants from the same age groups incrementally rather than simultaneously. The results demonstrated that data characteristics affect how children interpret observations, with significant age-related increases in detecting multiple data characteristics, in using them in combination, and in explicit verbal descriptions of data interpretations.
Journal Article
Making numbers out of magnitudes
2008
We argue that number principles may be learnable instead of innate, by suggesting that children acquire probabilistically true number concepts rather than algorithms. We also suggest that non-propositional representational formats (e.g., mental models) may implicitly provide information that supports the induction of numerical principles. Given probabilistically true number concepts, the problem of the acquisition of mathematical principles is eliminated.
Journal Article
Efficacy of a Metacognitive Writing-to-Learn Exercise in Improving Student Understanding and Performance in an Engineering Statics Course
by Rich, Jennifer Andrea; Masnick, Amy M; Goldberg, Saryn R
in Age differences; Chemical engineering; Chemistry
2015
Our current work is an extension of the preliminary results of an NSF-funded study that investigates the use of metacognitive writing-to-learn prompts in an engineering Statics course to improve student understanding and performance. Our methodology was determined by a comprehensive study of literature investigating the use of writing in the science classroom (Beall, 1998; Case & Gunstone, 2003; Case & Marshall, 2004; Driskill et al., 1998; Hanson & Williams, 2008; Hübner et al., 2006; Jamison, 2000; Nokes et al., 2011).

In the second year of the study (2013-2014), we developed a writing prompt that we are finding to be more successful than our previous iterations, both in efficacy for the students and ease of implementation for the professor. Specifically, after students solve selected engineering problems, they answer short-answer questions to describe any confusion they had about the concepts or computations required to solve the problem. The professor then demonstrates the problem solution in class as students correct their own work. Following this, students are asked to revisit and reexamine their conceptual and computational errors via writing in the classroom. We believe that students will gain a more lasting and deeper understanding of the statics concepts under examination if they are given an opportunity to reflect in writing on the reasons for their mistakes after instructor feedback. Finally, the instructor provides additional feedback indicating whether students' understanding of the reason for their error(s) was accurate. The inclusion of a revision-based writing step is consistent with current research showing this approach to be effective in improving student performance (Hübner et al., 2006; Ionas et al., 2012; Kagesten & Engelbrecht, 2006; Porter & Masingila, 2000).

The revision step allows students to clarify their new understanding of the problem under consideration and also to reassess the reasons for their initial confusion. In this way, it closes the loop for them metacognitively. Our preliminary data from this approach, presented at the ASEE 2014 meeting, suggested improvement in student performance as seen in their final exam scores, but the sample was small, and the effect did not reach significance. We are therefore implementing the same methodology during the Fall 2014 semester with a cohort of approximately 60 students in two sections of Statics, taught by the same professor as the previous semester, for a more robust test of the efficacy of the intervention. To rule out the possibility of cohort differences, we are also collecting GPA/SAT and prerequisite grades in math and physics for past and current Statics students.

For the ASEE 2015 conference, we plan to present the most recent findings of this study. By Spring 2015, we will have a clearer picture of the efficacy of our intervention. If appropriate, we will then be ready to discuss ways that this writing intervention might be utilized effectively in other engineering education contexts. If we find that the intervention is not effective, we will present changes that we plan to implement to improve learning in Statics.

References
Beall, H. (1998). Expanding the scope of writing in chemical education. Journal of Science Education and Technology, 7(3), 259-270.
Case, J. & Gunstone, R. F. (2003). Approaches to learning in a second year chemical engineering course. International Journal of Science Education, 25(7), 801-819.
Case, J. & Marshall, D. (2004). Between deep and surface: Procedural approaches to learning in engineering education contexts. Studies in Higher Education, 29(5), 605-615.
Driskill, L., Lewis, K., Stearns, J., & Volz, T. (1998). Students' reasoning and rhetorical knowledge in first-year chemistry. Language and Learning Across the Disciplines, 2(3), 3-24.
Hanson, J. H. & Williams, J. M. (2008). Using writing assessments to improve students' self-assessment and communication in an engineering Statics course. Journal of Engineering Education, 97(4), 515-529.
Hübner, S., Nückles, M., & Renkl, A. (2006). Prompting cognitive and metacognitive processing in writing-to-learn enhances learning outcomes. In R. Sun (Ed.), Proceedings of the 28th Annual Conference of the Cognitive Science Society (pp. 357-362). Mahwah, NJ: Erlbaum.
Ionas, I., Cernusca, D., & Collier, H. L. (2012). Prior knowledge influence on self-explanation effectiveness when solving problems: An exploratory study in science learning. International Journal of Teaching and Learning in Higher Education, 24(3), 349-358.
Jamison, R. (2000). Learning the language of mathematics. Language and Learning Across the Disciplines, 4(1), 45-54.
Kagesten, O., & Engelbrecht, J. (2006). Supplementary explanations in undergraduate mathematics assessment: A forced formative writing activity. European Journal of Engineering Education, 31(6), 705-715.
Nokes, T. J., Hausmann, R. G. M., VanLehn, K., & Gershman, S. (2011). Testing the instructional fit hypothesis: The case for self-explanation prompts. Instructional Science, 39, 645-666.
Porter, M. & Masingila, J. O. (2000). Examining the effects of writing on conceptual and procedural knowledge in calculus. Educational Studies in Mathematics, 42, 165-177.
Conference Proceeding
Using discrete choice experiments to design interventions for heterogeneous preferences: protocol for a pragmatic randomised controlled trial of a preference-informed, heterogeneity-focused, HIV testing offer for high-risk populations
by Thielman, Nathan M; Hobbie, Amy; Flaherty, Brian
in Acquired immune deficiency syndrome; AIDS; Counseling
2020
Introduction
Approximately one million undiagnosed persons living with HIV in Southern and Eastern Africa need to test for HIV. Novel approaches are necessary to identify HIV testing options that match the heterogeneous testing preferences of high-risk populations. This pragmatic randomised controlled trial (PRCT) will evaluate the efficacy of a preference-informed, heterogeneity-focused HIV counselling and testing (HCT) offer for improving rates of HIV testing in two high-risk populations.

Methods and analysis
The study will be conducted in Moshi, Tanzania. The PRCT will randomise 600 female barworkers and 600 male Kilimanjaro mountain porters across three study arms. All participants will receive an HIV testing offer comprising four preference-informed testing options, including one 'common' option—comprising features that are commonly available in the area and, on average, most preferred among study participants—and three options that are specific to the study arm. Options will be identified using mixed logit and latent class analyses of data from a discrete choice experiment (DCE). Participants in Arm 1 will be offered the common option and three 'targeted' options that are predicted to be more preferred than the common option and combine features widely available in the study area. Participants in Arm 2 will be offered the common option and three 'enhanced' options, which also include HCT features that are not yet widely available in the study area. Participants in Arm 3, an active control arm, will be offered the common option and three predicted 'less preferred' options. The primary outcome will be uptake of HIV testing.

Ethics and dissemination
Ethical approval was obtained from the Duke University Health System IRB, the University of South Carolina IRB, the Ethics Review Committee at Kilimanjaro Christian Medical University College, Tanzania's National Institute for Medical Research, and the Tanzania Food & Drugs Authority (now Tanzania Medicines & Medical Devices Authority). Findings will be published in peer-reviewed journals. The use of rigorous DCE methods for the preference-based design and tailoring of interventions could lead to novel policy options and implementation science approaches.

Trial registration number
NCT02714140.
Journal Article