6 result(s) for "Milesi, Mikaela"
Co-designing AI-powered learning analytics: bringing students and teachers together
There is a growing interest in involving students and teachers in the design of human-centered Learning Analytics (LA) systems to align them with authentic learning needs. Yet, limited prior research has explored the implications of integrating both students' and teachers' perspectives within a structured co-design process. To address this shortcoming in the literature, we report on a study that examined how undergraduate nursing students and teachers co-designed an AI-powered LA system to support post-debriefing reflection on teamwork and communication in the context of healthcare simulation. This qualitative study, using a co-design approach, examined the design process of an LA system from conceptualization to post-use evaluation. The study addressed two key questions: (i) What tensions emerge from the contrasting perspectives of students and teachers in the co-design of an AI-powered LA system? and (ii) How do students and teachers perceive their joint participation in the co-design process? Three key design tension themes emerged from the contrasting perspectives of students and teachers: teaching–learning goals tension, privacy–utility tension, and human–AI guidance preferences tension. The collaborative design process revealed mutual benefits: students valued teachers' guidance in refining ideas and aligning system goals with learning objectives, while teachers, initially cautious about student involvement, came to see co-design as an opportunity to empower students and deepen their own understanding of responsible data use in practice. These findings contribute to the broader understanding of co-design dynamics in educational technology, underscoring the importance of balanced stakeholder involvement in developing practical, context-aware LA systems.
From comic panels to clinical practice: data comics as a learning analytics tool in nursing simulation
In healthcare education, it is important for nursing students to be able to reflect on their performance in high-fidelity clinical simulations in order to develop key skills. Learning Analytics (LA) offers opportunities for data-driven reflection by providing visual representations of educational experiences. While many LA tools rely on data visualisations to communicate insights, these are often difficult for students to interpret, limiting their effectiveness. Despite these challenges, there is limited research exploring alternative and potentially more accessible formats—such as data comics, a narrative visualisation technique that integrates data with the structure of traditional comic strips—to represent and communicate insights from learner data in a more engaging way. This study addresses that gap through a qualitative analysis of nursing students' perceptions of data comics as reflective tools, focusing on: (i) support for student reflection, (ii) advantages and limitations, and (iii) concerns about their use in healthcare education. Third-year nursing students who participated in a simulation were interviewed and asked to reflect on personalised data comic prototypes generated from their multimodal data using a mix of human input and AI methods. The results indicated that while data comics offer an engaging and accessible form of reflective visualisation, care must be taken in their design to ensure they are appropriate for the target audience and do not oversimplify the simulation experience. These findings indicate that data comics should not act as a replacement for conventional visualisations but rather serve as supplementary material to communicate contextual information or aid in the interpretation of visualisations.
TeamVision: An AI-powered Learning Analytics System for Supporting Reflection in Team-based Healthcare Simulation
Healthcare simulations help learners develop teamwork and clinical skills in a risk-free setting, promoting reflection on real-world practices through structured debriefs. However, despite its potential as a debriefing resource, video is difficult to use in practice, leaving a gap in providing concise, data-driven summaries to support effective debriefing. Addressing this, we present TeamVision, an AI-powered multimodal learning analytics (MMLA) system that captures voice presence, automated transcriptions, body rotation, and positioning data, offering educators a dashboard to guide debriefs immediately after simulations. We conducted an in-the-wild study with 56 teams (221 students) and recorded debriefs led by six teachers using TeamVision. Follow-up interviews with 15 students and five teachers explored perceptions of its usefulness, accuracy, and trustworthiness. This paper examines: (i) how TeamVision was used in debriefing, (ii) what educators found valuable and challenging, and (iii) perceptions of its effectiveness. Results suggest TeamVision enables flexible debriefing and highlights the challenges and implications of using AI-powered systems in healthcare simulation.
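To make the multimodal capture concrete, the following is a minimal sketch of the kind of per-student record and team-level summary such an MMLA dashboard might aggregate. The field names and schema here are hypothetical illustrations, not TeamVision's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class SimulationRecord:
    """One student's multimodal trace from a team simulation (hypothetical schema)."""
    student_id: str
    speaking_seconds: float                    # voice presence
    transcript: list[str] = field(default_factory=list)   # automated transcription
    positions: list[tuple[float, float]] = field(default_factory=list)  # (x, y) in room

def team_voice_share(records: list[SimulationRecord]) -> dict[str, float]:
    """Each student's fraction of total team speaking time, as a debrief
    dashboard might display to surface participation imbalance."""
    total = sum(r.speaking_seconds for r in records) or 1.0
    return {r.student_id: r.speaking_seconds / total for r in records}

team = [
    SimulationRecord("s1", speaking_seconds=120.0),
    SimulationRecord("s2", speaking_seconds=60.0),
    SimulationRecord("s3", speaking_seconds=20.0),
]
shares = team_voice_share(team)  # e.g. s1 spoke 60% of the team's total talk time
```

A summary like `shares` is the sort of concise, data-driven cue a teacher could scan immediately after a simulation instead of reviewing the full video.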
Relational AI in Education: Reciprocity, Participatory Design, and Indigenous Worldviews
Education is not merely the transmission of information or the optimisation of individual performance; it is a fundamentally social, constructive, and relational practice. However, recent advances in generative artificial intelligence (GenAI) increasingly emphasise efficiency, automation, and individualised assistance, risking the weakening of relational learning processes. Despite growing adoption, AI in education (AIED) research has yet to fully articulate how AI can be designed in ways that sustain the social and ecological relationships through which learning occurs. In this paper, we re-centre education as relational and frame learner-AI interactions as context-specific relationships with clearly defined purposes and boundaries, rather than positioning them as substitutes for, or replacements of, human interaction. Grounded in participatory design practices and inspired by Indigenous worldviews (including Aboriginal Australian, Native American, and Mesoamerican traditions) that foreground reciprocity and relational accountability, we argue that meaningful educational AI should support learning with others rather than replace them. We advance this perspective by: i) conceptualising AIED as a relational design problem grounded in reciprocity; ii) articulating key tensions introduced by GenAI in education; and iii) outlining design directions that expand the AIED design space toward reciprocity, including when not to use AI, how to define pedagogical boundaries, and how to support responsible uses of AIED innovations that sustain communities and natural environments.
Chatting with a Learning Analytics Dashboard: The Role of Generative AI Literacy on Learner Interaction with Conventional and Scaffolding Chatbots
Learning analytics dashboards (LADs) simplify complex learner data into accessible visualisations, providing actionable insights for educators and students. However, their educational effectiveness has not always matched the sophistication of the technology behind them. Explanatory and interactive LADs, enhanced by generative AI (GenAI) chatbots, hold promise by enabling dynamic, dialogue-based interactions with data visualisations and offering personalised feedback through text. Yet, the effectiveness of these tools may be limited by learners' varying levels of GenAI literacy, a factor that remains underexplored in current research. This study investigates the role of GenAI literacy in learner interactions with conventional (reactive) versus scaffolding (proactive) chatbot-assisted LADs. Through a comparative analysis of 81 participants, we examine how GenAI literacy is associated with learners' ability to interpret complex visualisations and their cognitive processes during interactions with chatbot-assisted LADs. Results show that while both chatbots significantly improved learner comprehension, those with higher GenAI literacy benefited the most, particularly with conventional chatbots, demonstrating diverse prompting strategies. Findings highlight the importance of considering learners' GenAI literacy when integrating GenAI chatbots in LADs and educational technologies. Incorporating scaffolding techniques within GenAI chatbots can be an effective strategy, offering a more guided experience that reduces reliance on learners' GenAI literacy.
The Effects of Generative AI Agents and Scaffolding on Enhancing Students' Comprehension of Visual Learning Analytics
Visual learning analytics (VLA) is becoming increasingly adopted in educational technologies and learning analytics dashboards to convey critical insights to students and educators. Yet many students experience difficulties in comprehending complex VLA due to their limited data visualisation literacy. While conventional scaffolding approaches like data storytelling have shown effectiveness in enhancing students' comprehension of VLA, these approaches remain difficult to scale and adapt to individual learning needs. Generative AI (GenAI) technologies, especially conversational agents, offer potential solutions by providing personalised and dynamic support to enhance students' comprehension of VLA. This study investigates the effectiveness of GenAI agents, particularly when integrated with scaffolding techniques, in improving students' comprehension of VLA. A randomised controlled trial was conducted with 117 higher education students to compare the effects of two types of GenAI agents: passive agents, which respond to student queries, and proactive agents, which utilise scaffolding questions, against standalone scaffolding in a VLA comprehension task. The results show that passive agents yield improvements comparable to standalone scaffolding both during and after the intervention. Notably, proactive GenAI agents significantly enhance students' VLA comprehension compared to both passive agents and standalone scaffolding, with these benefits persisting beyond the intervention. These findings suggest that integrating GenAI agents with scaffolding can have lasting positive effects on students' comprehension skills and support genuine learning.