Catalogue Search | MBRL
Explore the vast range of titles available.
4,506 results for "Eye-Tracking"
Eye tracking in second language acquisition and bilingualism : a research synthesis and methodological guide
Eye Tracking in Second Language Acquisition and Bilingualism provides foundational knowledge and hands-on advice for designing, conducting, and analysing eye-tracking research in applied linguistics. Godfroid's research synthesis and methodological guide introduces the reader to fundamental facts about eye movements, eye-tracking paradigms for language scientists, data analysis, and the practicalities of building a lab. This indispensable book will appeal to undergraduate students learning principles of experimental design, graduate students developing their theoretical and statistical repertoires, experienced scholars looking to expand their own research, and eye-tracking professionals.
Eye tracking: empirical foundations for a minimal reporting guideline
by Ettinger, Ulrich; Benjamins, Jeroen S.; Zemblys, Raimondas
in Empirical Research; Eye Movements; Eye-Tracking Technology
2023
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").
Journal Article
Quantitative comparison of a mobile, tablet-based eye-tracker and two stationary, video-based eye-trackers
by Dowiasch, Stefan; Thomas, Uwe; Bremmer, Frank
in Adult; Behavioral Science and Psychology; Cognitive Psychology
2025
The analysis of eye movements is a noninvasive, reliable and fast method to detect and quantify brain (dys)function. Here, we investigated the performance of two novel eye-trackers—the Thomas Oculus Motus-research mobile (TOM-rm) and the TOM-research stationary (TOM-rs)—and compared them with the performance of a well-established video-based eye-tracker, i.e., the EyeLink 1000 Plus (EL). The TOM-rm is a fully integrated, tablet-based mobile device that presents visual stimuli and records head-unrestrained eye movements at 30 Hz without additional infrared illumination. The TOM-rs is a stationary, video-based eye-tracker that records eye movements at either high spatial or high temporal resolution. We compared the performance of all three eye-trackers in two different behavioral tasks: pro- and anti-saccade and free viewing. We collected data from 30 human subjects while running all three eye-tracking devices in parallel. Parameters requiring a high spatial or temporal resolution (e.g., saccade latency or gain), as derived from the data, differed significantly between the EL and the TOM-rm in both tasks. Differences between results derived from the TOM-rs and the EL were most likely due to experimental conditions, which could not be optimized for both systems simultaneously. We conclude that the TOM-rm can be used for measuring basic eye-movement parameters, such as the error rate in a typical pro- and anti-saccade task, or the number and position of fixations in a visual foraging task, reliably at comparably low spatial and temporal resolution. The TOM-rs, on the other hand, can provide high-resolution oculomotor data at least on a par with an established reference system.
Journal Article
Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles
by Shanidze, Natela M.; Velisar, Anca
in Behavioral Science and Psychology; Cognitive Psychology; Eye Movements
2024
Head-mounted, video-based eye tracking is becoming increasingly common and has promise in a range of applications. Here, we provide a practical and systematic assessment of the sources of measurement uncertainty for one such device – the Pupil Core – in three eye-tracking domains: (1) the 2D scene camera image; (2) the physical rotation of the eye relative to the scene camera 3D space; and (3) the external projection of the estimated gaze point location onto the target plane or in relation to world coordinates. We also assess eye camera motion during active tasks relative to the eye and the scene camera, an important consideration as the rigid arrangement of eye and scene camera is essential for proper alignment of the detected gaze. We find that eye camera motion, improper gaze point depth estimation, and erroneous eye models can all lead to added noise that must be considered in the experimental design. Further, while calibration accuracy and precision estimates can help assess data quality in the scene camera image, they may not be reflective of errors and variability in gaze point estimation. These findings support the importance of eye model constancy for comparisons across experimental conditions and suggest additional assessments of data reliability may be warranted for experiments that require the gaze point or measure eye movements relative to the external world.
Journal Article
Utilizing Interactive Surfaces to Enhance Learning, Collaboration and Engagement: Insights from Learners’ Gaze and Speech
by Sharma, Kshitij; Leftheriotis, Ioannis; Giannakos, Michail
in Adult; collaboration outcome; Collaborative learning
2020
Interactive displays are becoming increasingly popular in informal learning environments as an educational technology for improving students’ learning and enhancing their engagement. Interactive displays have the potential to reinforce and maintain collaboration and rich interaction with the content in a natural and engaging manner. Despite the increased prevalence of interactive displays for learning, there is limited knowledge about how students collaborate in informal settings and how their collaboration around interactive surfaces influences their learning and engagement. We present a dual eye-tracking study involving 36 participants, in which a two-stage within-group experiment was conducted following a single-group time-series design, with repeated measurement of participants’ gaze, voice, game logs, and learning-gain tests. Various correlation, regression, and covariance analyses were employed to investigate students’ collaboration, engagement, and learning gains during the activity. The results show that, collaboratively, pairs with high gaze similarity have high learning outcomes. Individually, participants who spend a high proportion of time acquiring complementary information from the images and textual parts of the learning material attain high learning outcomes. Moreover, the results show that speech can be an interesting covariate when analyzing the relation between gaze variables and learning gains (and task-based performance). We also show that gaze is an effective proxy for the cognitive mechanisms underlying collaboration, not only in formal settings but also in informal learning scenarios.
Journal Article
Online eye tracking and real-time sentence processing: On opportunities and efficacy for capturing psycholinguistic effects of different magnitudes and diversity
by Prystauka, Yanina; Rothman, Jason; Altmann, Gerry T. M.
in Behavioral Science and Psychology; Cognitive Psychology; Eye Movements - physiology
2024
Online research methods have the potential to facilitate equitable accessibility to otherwise-expensive research resources, as well as to more diverse populations and language combinations than currently populate our studies. In psycholinguistics specifically, webcam-based eye tracking is emerging as a powerful online tool capable of capturing sentence processing effects in real time. The present paper asks whether webcam-based eye tracking provides the necessary granularity to replicate effects—crucially both large and small—that tracker-based eye tracking has shown. Using the Gorilla Experiment Builder platform, this study set out to replicate two psycholinguistic effects: a robust one, the verb semantic constraint effect, first reported in Altmann and Kamide, Cognition 73(3), 247–264 (1999), and a smaller one, the lexical interference effect, first examined by Kukona et al., Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(2), 326 (2014). Webcam-based eye tracking was able to replicate both effects, thus showing that its functionality is not limited to large effects. Moreover, the paper also reports two approaches to computing statistical power and discusses the differences in their outputs. Beyond discussing several important methodological, theoretical, and practical implications, we offer some further technical details and advice on how to implement webcam-based eye-tracking studies. We believe that the advent of webcam-based eye tracking, at least in respect of the visual world paradigm, will kickstart a new wave of more diverse studies with more diverse populations.
Journal Article
Emotion Recognition Using Eye-Tracking: Taxonomy, Review and Current Challenges
by Mountstephens, James; Teo, Jason; Lim, Jia Zheng
in affective computing; Algorithms; Artificial intelligence
2020
The ability to detect users’ emotions for the purpose of emotion engineering is currently one of the main endeavors of machine learning in affective computing. Among the more common approaches to emotion detection are methods that rely on electroencephalography (EEG), facial image processing, and speech inflections. Although eye-tracking is fast becoming one of the most commonly used sensor modalities in affective computing, it is still a relatively new approach for emotion detection, especially when it is used exclusively. In this survey paper, we present a review of emotion recognition using eye-tracking technology, including a brief introductory background on emotion modeling, eye-tracking devices and approaches, emotion stimulation methods, the emotion-relevant features extractable from eye-tracking data, and, most importantly, a categorical summary and taxonomy of the current literature on emotion recognition using eye-tracking. This review concludes with a discussion of the current open research problems and prospective future research directions that will be beneficial for expanding the body of knowledge in emotion detection using eye-tracking as the primary sensor modality.
Journal Article
ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays
by Mukhametov, Sergey; Kapp, Sebastian; Barz, Michael
in accuracy; Augmented Reality; eye tracking
2021
Currently, an increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases of these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is interesting for applied research based on eye tracking, for example in the cognitive or educational sciences. While some research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n=21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is resting, which is on par with state-of-the-art mobile eye trackers.
Journal Article