32,423 results for "Learning analytics"
A Real-Time Learning Analytics Dashboard for Automatic Detection of Online Learners’ Affective States
Students’ affective states describe their engagement, concentration, attitude, motivation, happiness, sadness, frustration, off-task behavior, and confusion during learning. In online learning, students’ affective states are a key determinant of learning quality. However, measuring the various affective states, and what influences them, is exceedingly challenging for a lecturer who has no real interaction with the students. Existing studies rely primarily on self-reported data to understand students’ affective states, whereas this paper presents a novel learning analytics system called MOEMO (Motion and Emotion) that measures online learners’ affective states of engagement and concentration using emotion data. The novelty of this research is thus visualizing online learners’ affective states on lecturers’ screens in real time through an automated emotion-detection process. In both real time and offline, the system extracts emotion data by analyzing facial features in lecture videos captured by a laptop’s typical built-in web camera. The system distinguishes five engagement levels (“strong engagement”, “high engagement”, “medium engagement”, “low engagement”, and “disengagement”) and two concentration levels (“focused” and “distracted”). Furthermore, the dashboard is designed to provide insight into students’ emotional states and the clusters of engaged and disengaged students, assist with interventions, create an after-class summary report, and configure the automation parameters to adapt to the study environment.
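The abstract does not specify how detected emotions are mapped to the five engagement levels and two concentration states. Purely as an illustrative sketch (the emotion labels, thresholds, and function names below are assumptions, not MOEMO's actual model), a per-frame classification might look like:

```python
# Hypothetical sketch: map per-frame facial-emotion scores to the five
# engagement levels and two concentration states named in the abstract.
# Thresholds and the choice of emotions are illustrative assumptions.

ENGAGEMENT_LEVELS = [
    "disengagement", "low engagement", "medium engagement",
    "high engagement", "strong engagement",
]

def engagement_level(emotions: dict) -> str:
    """emotions: per-frame probabilities, e.g. {"neutral": 0.6, "happy": 0.3}."""
    # Treat attentive emotions as positive evidence, off-task ones as negative.
    score = (emotions.get("neutral", 0.0) + emotions.get("happy", 0.0)
             - emotions.get("bored", 0.0) - emotions.get("frustrated", 0.0))
    # Bucket the score in [-1, 1] into five bands.
    idx = min(4, max(0, int((score + 1.0) / 2.0 * 5)))
    return ENGAGEMENT_LEVELS[idx]

def concentration(face_detected: bool, gaze_on_screen: bool) -> str:
    # Two concentration states from the abstract: "focused" vs "distracted".
    return "focused" if face_detected and gaze_on_screen else "distracted"
```

A real system would of course aggregate such per-frame labels over time before showing them on the lecturer's dashboard.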
A large-scale implementation of predictive learning analytics in higher education: the teacher's role and perspective
By collecting longitudinal learner and learning data from a range of resources, predictive learning analytics (PLA) are used to identify learners who may not complete a course, typically described as being at risk. Mixed findings have been reported on how teachers perceive, use, and interpret PLA data, necessitating further research in this direction. The aim of this study is to evaluate whether providing teachers in a distance learning higher education institution with PLA data improves students' performance and empowers teachers to identify and assist students at risk. Drawing on the Technology Acceptance and Academic Resistance models, a university-wide, multi-methods study with 59 teachers, nine courses, and 1,325 students revealed that teachers can positively affect students' performance when they engage with PLA. Follow-up semi-structured interviews illuminated teachers' actual uses of the predictive data and revealed its impact on teaching practices and intervention strategies to support students at risk.
An artificial intelligence-driven learning analytics method to examine the collaborative problem-solving process from the complex adaptive systems perspective
Collaborative problem solving (CPS) enables student groups to complete learning tasks, construct knowledge, and solve problems. Previous research has argued for the importance of examining the complexity of CPS, including its multimodality, dynamics, and synergy, from the complex adaptive systems perspective. However, there is limited empirical research examining the adaptive and temporal characteristics of CPS, which may have led to an oversimplified representation of the real complexity of the CPS process. To expand our understanding of the nature of CPS in online interaction settings, the present research collected multimodal process and performance data (i.e., speech, computer screen recordings, and concept map data) and proposed a three-layered analytical framework that integrates AI algorithms with learning analytics to analyze the regularity of groups’ collaboration patterns. The results surfaced three types of collaborative patterns: the behaviour-oriented collaborative pattern (Type 1), associated with medium-level performance; the communication-behaviour-synergistic collaborative pattern (Type 2), associated with high-level performance; and the communication-oriented collaborative pattern (Type 3), associated with low-level performance. This research further highlights the multimodal, dynamic, and synergistic characteristics of groups’ collaborative patterns to explain the emergence of an adaptive, self-organizing system during the CPS process. Based on these empirical results, theoretical, pedagogical, and analytical implications are discussed to guide future research and practice of CPS.
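The abstract does not name the AI algorithms in its three-layered framework. As a minimal sketch of the general idea of grouping sessions by multimodal features (the feature names, toy data, and the use of plain k-means are assumptions, not the paper's method):

```python
import math

# Illustrative only: cluster per-session multimodal feature vectors
# (speech rate, screen actions, concept-map edits -- assumed features)
# into k collaborative patterns with a minimal k-means. k=3 mirrors the
# three pattern types described in the abstract.

def kmeans(points, k=3, iters=50):
    centroids = list(points[:k])  # simple deterministic seeding
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # Recompute each centroid as the mean of its assigned points.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = tuple(sum(xs) / len(members)
                                     for xs in zip(*members))
    return labels

# Toy data: (speech_rate, screen_actions, map_edits) per group session.
sessions = [(0.9, 0.2, 0.3), (0.8, 0.3, 0.2),   # communication-heavy
            (0.2, 0.9, 0.8), (0.3, 0.8, 0.9),   # behaviour-heavy
            (0.7, 0.7, 0.7), (0.8, 0.8, 0.6)]   # synergistic
labels = kmeans(sessions, k=3)
```

The labels would then be interpreted against performance data, as the paper does when associating each pattern type with a performance level.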
Predict or describe? How learning analytics dashboard design influences motivation and statistics anxiety in an online statistics course
Based on achievement goal theory, this experimental study explored the influence of predictive and descriptive learning analytics dashboards on graduate students’ motivation and statistics anxiety in an online graduate-level statistics course. Participants were randomly assigned to one of three groups: (a) predictive dashboard, (b) descriptive dashboard, or (c) control (i.e., no dashboard). Measures of motivation and statistics anxiety were collected at the beginning and end of the semester via the Motivated Strategies for Learning Questionnaire and the Statistical Anxiety Rating Scale. Individual semi-structured interviews were used to understand learners’ perceptions of the course and whether the dashboards shaped the meaning of their learning experiences. Results indicate that, compared with the control group, the predictive dashboard significantly reduced learners’ interpretation anxiety and had an effect on intrinsic goal orientation that depended on learners’ initial level of intrinsic goal orientation. Compared with the control group, both the predictive and descriptive dashboards reduced worth-of-statistics anxiety (negative attitudes towards statistics) for learners who started the course with higher levels of this anxiety. Thematic analysis revealed that learners who adopted a more performance-avoidance goal orientation demonstrated higher levels of anxiety regardless of the dashboard used.
Multimodal learning analytics of collaborative patterns during pair programming in higher education
Pair programming (PP), a mode of collaborative problem solving (CPS) in computer programming education, asks two students to work in a pair to co-construct knowledge and solve problems. Given the complex multimodality of pair programming arising from students’ discourses, behaviors, and socio-emotions, it is critically important to examine their collaborative patterns from a holistic, multimodal, dynamic perspective; yet little research has investigated the collaborative patterns that this multimodality generates. This research applied multimodal learning analytics (MMLA) to collect multimodal process and product data from 19 undergraduate student pairs and to examine their collaborative patterns based on quantitative, structural, and transitional characteristics. The results revealed four collaborative patterns (a consensus-achieved pattern, an argumentation-driven pattern, an individual-oriented pattern, and a trial-and-error pattern), associated with different levels of process and summative performance. Theoretical, pedagogical, and analytical implications are provided to guide future research and practice.
The effect of providing learning analytics on student behaviour and performance in programming: a randomised controlled experiment
We use a randomised experiment to study the effect of offering half of 556 freshman students a learning analytics dashboard, together with a weekly email linking to it, on their behaviour in the online environment and their final-exam performance. The dashboard shows their progress in the learning management system, their predicted chance of passing, their predicted grade, and their intermediate online performance compared with the total cohort. Both the email with dashboard access and dashboard use have positive effects on student behaviour in the online environment, but we find no effect on student performance in the final exam of the programming course. We do, however, find differential effects by specialisation and student characteristics.
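The random assignment the abstract describes (half of 556 students offered the dashboard, half not) can be sketched as follows; the function name, seed, and student IDs are illustrative assumptions, not details from the paper:

```python
import random

# Sketch of a simple randomised assignment: half of the cohort gets
# dashboard access plus a weekly email link (treatment), half does not
# (control). Seed and names are illustrative assumptions.

def assign_treatment(student_ids, seed=42):
    rng = random.Random(seed)       # fixed seed for a reproducible split
    shuffled = student_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    treated = set(shuffled[:half])  # dashboard + weekly email group
    return {sid: (sid in treated) for sid in student_ids}

groups = assign_treatment(list(range(556)))
```

Randomising at the individual level like this is what licenses the paper's causal reading of differences between the two groups.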
The use of learning analytics and the potential risk of harm for K-12 students participating in digital learning environments
This paper responds, from a K-12 educational environment perspective, to the manuscript “Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective” (Willis et al. in Educ Technol Res Dev 66(4):1029–1049, 2018). Willis, Slade, and Prinsloo’s typology of ethical approaches to learning analytics adds value to the special-issue topic of shifting to digital: it provides different ways to view learning analytics, along with the type of approval(s) possibly needed for each view. K-12 educational institutions can use the manuscript as a starting point for reviewing their ethical oversight when analyzing student data as more schools shift to digital. A limitation of the manuscript in supporting the shift to digital is that it was published before the overwhelming shift to digital mandated as a result of COVID-19. Future work with a K-12 focus could include education agencies deriving a K-12-specific typology from a review of ethical oversight protocols, or analyzing how the K-12 shift to digital affects the original typology. During the 2020 pandemic, K-12 schools in the United States moved to digital learning quickly and in large numbers, producing a wealth of digital learning analytics that can be used to improve student learning outcomes. However, local education agencies must first establish a clear framework for overseeing how digital learning analytics will be used: K-12 students are at risk of harm if agencies do not stop and carefully reflect on the potential risks to students resulting from decisions made on the basis of digital learning analytics.
Wild brooms and learning analytics
In this commentary we present an analogy between Johann Wolfgang von Goethe’s classic poem, The Sorcerer’s Apprentice, and institutional learning analytics. In doing so, we hope to offer institutions a simple heuristic for considering their learning analytics initiatives: they might ask themselves, “Are we behaving like the sorcerer’s apprentice?” Such behaviour is characterized by initiatives lacking faculty involvement, and we argue that when initiatives fit this pattern, they also lack consideration of their potential hazards and are likely to fail. We join others in advocating that institutions instead create ecosystems that enable faculty leadership in institutional learning analytics efforts.
Personalized learning analytics intervention approach for enhancing student learning achievement and behavioral engagement in blended learning
The application of student interaction data is a promising field for blended learning (BL), which combines conventional face-to-face and online learning activities. However, the online learning technologies used in BL settings are particularly challenging for students with lower self-regulatory abilities. In this study, a personalized learning analytics (LA) intervention approach that incorporates ebook and recommendation systems is proposed. The approach gives students actionable feedback on personalized remedial actions as an intervention to help them engage strategically with the ebook system and avoid academic failure in BL. A quasi-experiment was conducted with two classes of an undergraduate course that implemented a conventional BL model. The experimental group comprised 45 students from one class who learned with the proposed approach and received the personalized intervention, whereas the control group comprised 42 students from the other class who learned with the conventional BL approach without the personalized intervention. The experimental results indicated that the proposed approach can improve students’ learning achievement and behavioral engagement in BL. The findings provide pedagogical insights into applying LA interventions with actionable feedback in BL environments.
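The abstract does not detail how the recommendation system turns ebook activity into "actionable feedback". A minimal rule-based sketch of that idea (the metrics, thresholds, and messages are assumptions, not the paper's actual system) might be:

```python
# Hypothetical sketch of actionable remedial feedback: compare a
# student's ebook activity with the class median and suggest actions.
# Metrics, thresholds, and messages are illustrative assumptions.

def remedial_feedback(pages_read, annotations, class_median_pages):
    tips = []
    # Flag students reading far less than their peers.
    if pages_read < 0.5 * class_median_pages:
        tips.append("Review this week's assigned ebook chapters.")
    # Encourage active reading strategies when none are observed.
    if annotations == 0:
        tips.append("Try highlighting or annotating key passages.")
    return tips or ["Keep up your current study pace."]
```

In the study's design, such suggestions would be delivered as the personalized intervention to the experimental group only.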
A Learning Analytics Framework Based on Human-Centered Artificial Intelligence for Identifying the Optimal Learning Strategy to Intervene in Learning Behavior
Big data in education enables the analysis of learning behavior and yields many valuable results. However, the guidelines commonly followed when applying those analysis results are obscure and insufficient, making it difficult to translate them into actionable strategies for educational practice. This study aimed to solve this problem with a learning analytics (LA) framework. We proposed an LA framework based on human-centered artificial intelligence (AI) and emphasized its analysis-result application step, which transforms analysis results into the most suitable application strategy. To this end, we first built evidence-driven education for precise AI analytics and application, one of the core ideas of human-centered AI (HAI), into the design of this application step, and included a cognitive load test in the design. Second, to verify the effectiveness of the proposed framework and application strategy, two independent experiments were carried out, with machine learning and statistical data-analysis tools used to analyze the resulting data. The first experiment revealed the learning strategy that best matched the analysis results through the framework’s application step, and the second experiment showed that students who applied this learning strategy achieved better learning results. The second experiment also showed that the learning strategy imposed no additional cognitive load on the students who applied it, compared with those who did not.