Catalogue Search | MBRL
Explore the vast range of titles available.
8,479 result(s) for "Head movement"
Quantifying Cervical Rotation Smoothness: Exploring Various Jerk Metrics and Test-Retest Reliability of Jerk, Range of Motion, and Head Repositioning Accuracy
by Tróndarson, Tróndur Fríði; McPhee Christensen, Steffan Wittrup; Martínez-Echevarría, Diego
in Accuracy; Adult; Biomechanical Phenomena
2025
The assessment of smoothness, range of motion (ROM), and head repositioning accuracy (HRA) has gained attention in identifying sensorimotor impairments. Uncertainty persists about the approach for acquiring reliable measures, including choice of smoothness metric, normalization factors, and the required number of measurements for reliable results. This study aimed to address this uncertainty. Thirty healthy participants were included in this single-session randomized cross-over study. The experiment consisted of two parts. One focused on the test–retest assessment of head ROM into right rotation to the end of range from a neutral position using a self-selected movement speed and the HRA when returning to the start position. In the other part, participants repeated the previous tasks and performed head rotations at slower and faster speeds than their self-selected pace and to the beat of a metronome. All tasks were repeated ten times. For the test–retest, the intraclass correlation coefficient (ICC) values were 0.84–0.91 for ROM, 0.20–0.31 for HRA, and 0.65–0.90 for jerk for 1–10 repetitions. Normalizing jerk by v_mean and v_peak yielded similar variability, and both appeared equally valid for our data. However, normalizing by v_peak ensures desirable properties in the smoothness metric. Lower variability was observed when standardizing movements using a metronome. Based on the test–retest findings, three repetitions are recommended, as ICC values show marginal improvement beyond 2–3 repetitions, providing limited additional value.
Journal Article
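The jerk normalization discussed in the abstract above can be illustrated with a common smoothness measure, the log dimensionless jerk, which scales integrated squared jerk by movement duration and peak velocity. This is a minimal sketch of that general idea, not the metric pipeline used in the study; the function name and the choice of `np.gradient` for differentiation are illustrative assumptions.

```python
import numpy as np

def log_dimensionless_jerk(v, fs):
    """Log dimensionless jerk (LDLJ) of an angular-velocity trace.

    v  : 1-D array of angular velocity samples (e.g., deg/s)
    fs : sampling rate (Hz)
    Smoother movements give less negative values.
    """
    dt = 1.0 / fs
    duration = len(v) * dt
    # Jerk is the second time-derivative of velocity.
    jerk = np.gradient(np.gradient(v, dt), dt)
    # Scale by duration^3 / v_peak^2 so the quantity is dimensionless
    # and comparable across trials of different speed and length.
    dlj = (duration ** 3 / np.max(np.abs(v)) ** 2) * np.sum(jerk ** 2) * dt
    return -np.log(dlj)
```

For example, a smooth bell-shaped velocity profile scores higher (less negative) than the same profile with superimposed jitter, which is the intuition behind using jerk as a smoothness proxy.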
How robust are wearable eye trackers to slow and fast head and body movements?
by Niehorster, Diederick C.; Hooge, Ignace T. C.; Benjamins, Jeroen S.
in Behavioral Science and Psychology; Cognitive Psychology; Eye Movement Measurements
2023
How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8∘. However, most errors were smaller than 3∘. We discuss the implications of decreased accuracy in the context of different research scenarios.
Journal Article
Conservative and disruptive modes of adolescent change in human brain functional connectivity
by Vertes, Petra E.; Seidlitz, Jakob; Vaghi, Matilde M.
in Adolescent; Adolescent Development - physiology; Adolescents
2020
Adolescent changes in human brain function are not entirely understood. Here, we used multiecho functional MRI (fMRI) to measure developmental change in functional connectivity (FC) of resting-state oscillations between pairs of 330 cortical regions and 16 subcortical regions in 298 healthy adolescents scanned 520 times. Participants were aged 14 to 26 y and were scanned on 1 to 3 occasions at least 6 mo apart. We found 2 distinct modes of age-related change in FC: “conservative” and “disruptive.” Conservative development was characteristic of primary cortex, which was strongly connected at 14 y and became even more connected in the period from 14 to 26 y. Disruptive development was characteristic of association cortex and subcortical regions, where connectivity was remodeled: connections that were weak at 14 y became stronger during adolescence, and connections that were strong at 14 y became weaker. These modes of development were quantified using the maturational index (MI), estimated as Spearman’s correlation between edgewise baseline FC (at 14 y, FC14) and adolescent change in FC (ΔFC14−26), at each region. Disruptive systems (with negative MI) were activated by social cognition and autobiographical memory tasks in prior fMRI data and significantly colocated with prior maps of aerobic glycolysis (AG), AG-related gene expression, postnatal cortical surface expansion, and adolescent shrinkage of cortical thickness. The presence of these 2 modes of development was robust to numerous sensitivity analyses. We conclude that human brain organization is disrupted during adolescence by remodeling of FC between association cortical and subcortical areas.
Journal Article
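The maturational index (MI) defined in the abstract above, a per-region Spearman correlation between baseline edgewise connectivity and its adolescent change, can be sketched directly. This is a simplified illustration of the stated definition, not the authors' analysis code; the function name and matrix layout are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

def maturational_index(fc14, fc26):
    """Per-region maturational index: Spearman correlation, across a
    region's edges, between baseline connectivity (FC14) and change
    (FC26 - FC14). Negative MI marks 'disruptive' development,
    positive MI 'conservative' development.

    fc14, fc26 : (n_regions, n_regions) symmetric FC matrices.
    """
    delta = fc26 - fc14
    n = fc14.shape[0]
    mi = np.empty(n)
    for r in range(n):
        edges = np.delete(np.arange(n), r)  # exclude the self-connection
        mi[r], _ = spearmanr(fc14[r, edges], delta[r, edges])
    return mi
```

A region whose strong edges strengthen further (change proportional to baseline) gets MI near +1; a region whose weak edges strengthen while strong edges weaken gets MI near -1, matching the conservative/disruptive distinction.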
Real-world visual search goes beyond eye movements: Active searchers select 3D scene viewpoints too
2025
Visual search is a ubiquitous task; people search for objects on a daily basis. However, the majority of the existing visual search literature focuses on passive search on a 2D computer screen, a far cry from emulating a real-world environment. Search is a real-world task that involves active observation. Search targets may be occluded, completely out of the observer’s line of sight, or oriented in unconventional ways. This is typically mitigated by actively selecting viewpoints, an important aspect of search behaviour with limited scope on a computer screen. Our goal was to explore viewpoint selection in active visual search. Subjects’ eye and head movements were tracked as they moved freely while searching for toy objects in a controlled 3-dimensional environment, yielding the first such record of search-driven viewpoint selection. We found that subjects utilized their full range of eye and head motion to move from viewpoint to viewpoint, apparently employing a variety of objectives including changing viewing height and pose depending on object 3D pose. Subjects were also adept at selecting unobstructed views to search through otherwise occluded areas with objects. Furthermore, subjects completed the search task with high accuracy, even with no training on the environment. Although no learning was found in terms of accuracy over the duration of the experiment, increases in efficiency were found for other metrics such as response time, number of fixations, and distance travelled, particularly in target-present trials where the target was not visible from the starting location. These results tell the story of a visual system that selects and moves to useful and informative views to facilitate the successful execution of an active visual search task, and stress the significance of active vision research in understanding how vision is used in naturalistic environments.
Journal Article
Head movements affect skill acquisition for ball trapping in blind football
2024
Blind football players use head movements to accurately identify sound location when trapping a ball. Accurate sound localization is likely important for motor learning of ball trapping in blind football. However, whether head movements affect the acquisition of ball-trapping skills remains unclear. Therefore, this study examined the effect of head movements on skill acquisition during ball trapping. Overall, 20 sighted male college students were recruited and assigned to one of the following two groups: the conventional training group, where they were instructed to move leftward and rightward to align their body with the ball’s trajectory, and the head-movement-focused group, where they were instructed to follow the ball with their faces until the ball touched their feet, in addition to the conventional training instructions. Both groups underwent a 2-day training for ball trapping according to the specific instructions. The head-movement-focused group showed a decrease in errors in ball trapping at near distances and with larger downward head rotations in the sagittal plane compared to the conventional training group, indicating that during the skill acquisition training for ball trapping, the sound source can be localized more accurately using larger head rotations toward the ball. These results may help beginner-level players acquire better precision in their movements while playing blind football.
Journal Article
A dynamic sequence of visual processing initiated by gaze shifts
by Leonard, Emmalyn S. P.; Abe, Elliott T. T.; Mitchell, Jude F.
in 631/378/2613/1875; 631/378/2617/1795; 631/378/3917
2023
Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, where head movements are accompanied by saccadic eye movements, rather than to head movements where compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting a large component corresponds to onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.
Parker et al. recorded neural activity in V1 of freely moving mice and freely gazing marmosets. In both species, neurons respond to gaze shifts in a temporal sequence, such that new visual input is processed in a ‘coarse’ to ‘fine’ manner.
Journal Article
The Impact of Virtual Reality Content Characteristics on Cybersickness and Head Movement Patterns
2025
Virtual reality (VR) technology has gained popularity across various fields; however, its use often induces cybersickness, characterized by symptoms such as dizziness, nausea, and eye strain. This study investigated the differences in cybersickness levels and head movement patterns under three distinct VR viewing conditions: dynamic VR (DVR), static VR (SVR), and a control condition (CON) using a simulator. Thirty healthy adults participated, and their head movements were recorded using the Meta Quest 2 VR headset and analyzed using Python. The Virtual Reality Sickness Questionnaire (VRSQ) assessed subjective cybersickness levels. The results revealed that the SVR condition induced the highest VRSQ scores (M = 58.057), indicating the most severe cybersickness symptoms, while the DVR condition elicited significantly higher values in head movement variables, particularly in the coefficient of variation (CV) and integral values of head position along the vertical axis, and mean velocity (p < 0.05). These findings suggest that VR content characteristics directly influence users’ head movement patterns, closely related to cybersickness occurrence and severity. This study highlights the importance of analyzing head movement patterns in cybersickness research and provides insights for VR content design.
Journal Article
Large eye–head gaze shifts measured with a wearable eye tracker and an industrial camera
by Niehorster, Diederick C.; Hooge, Ignace T. C.; Hessels, Roy S.
in Adult; Behavioral Science and Psychology; Cognitive Psychology
2024
We built a novel setup to record large gaze shifts (up to 140∘). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye–head gaze shift literature. We conclude that our new inexpensive setup is good enough to investigate the dynamics of large eye–head gaze shifts. This novel setup could be used for future research on large eye–head gaze shifts, but also for research on gaze during, e.g., human interaction. We further discuss reference frames and terminology in head-free eye tracking. Despite a transition from head-fixed eye tracking to head-free gaze tracking, researchers still use head-fixed eye movement terminology when discussing world-fixed gaze phenomena. We propose to use more specific terminology for world-fixed phenomena, including gaze fixation, gaze pursuit, and gaze saccade.
Journal Article
Co-coding of head and whisker movements by both VPM and POm thalamic neurons
by Ahissar, Ehud; Tenzer, Alon; Saraf-Sinik, Inbar
in 631/378/2620/1838; 631/378/2629/2630; 631/378/3917
2024
Rodents continuously move their heads and whiskers in a coordinated manner while perceiving objects through whisker-touch. Studies in head-fixed rodents showed that the ventroposterior medial (VPM) and posterior medial (POm) thalamic nuclei code for whisker kinematics, with POm involvement reduced in awake animals. To examine VPM and POm involvement in coding head and whisker kinematics in awake, head-free conditions, we recorded thalamic neuronal activity and tracked head and whisker movements in male mice exploring an open arena. Using optogenetic tagging, we found that in freely moving mice, both nuclei equally coded whisker kinematics and robustly coded head kinematics. The fraction of neurons coding head kinematics increased after whisker trimming, ruling out whisker-mediated coding. Optogenetic activation of thalamic neurons evoked overt kinematic changes and increased the fraction of neurons leading changes in head kinematics. Our data suggest that VPM and POm integrate head and whisker information and can influence head kinematics during tactile perception.
Whether the posterior medial (POm) thalamic nucleus processes whisking kinematics was not clear from studies in head-fixed rodents. By studying freely moving mice, here authors demonstrate that the POm does encode whisker kinematics. Additionally, they show that both POm and the ventroposterior medial (VPM) thalamic nuclei process and can influence head kinematics.
Journal Article
Motion-corrected eye tracking improves gaze accuracy during visual fMRI experiments
2025
Human eye movements are essential for understanding cognition, yet achieving high-precision eye tracking in functional Magnetic Resonance Imaging (fMRI) remains challenging. Even slight head shifts from the initial calibration position can introduce drift in eye tracking data, leading to substantial gaze inaccuracies. To address this, we present Motion-Corrected Eye Tracking (MoCET), which corrects drift using head motion parameters derived from fMRI preprocessing. MoCET requires no additional hardware and can be applied retrospectively to existing datasets. We show that it outperforms conventional detrending methods with respect to accuracy of gaze estimation and offers higher spatial and temporal precision compared to magnetic resonance-based eye tracking approaches. By overcoming a key limitation in integrating eye tracking with fMRI, MoCET enables precise investigations of naturalistic vision and cognition in fMRI research.
Head movements during eye tracking in fMRI frequently distort gaze signals. This paper presents motion-corrected eye tracking, a method that corrects head motion-induced errors to produce accurate gaze data in fMRI experiments.
Journal Article
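The core idea of the MoCET abstract above, removing the component of a gaze signal that is linearly predicted by fMRI head-motion parameters, can be sketched as an ordinary least-squares regression. This is a simplified illustration of that idea, not the published MoCET implementation; the function name and the six-parameter motion layout are assumptions.

```python
import numpy as np

def correct_gaze_drift(gaze, motion_params):
    """Regress head-motion parameters out of a gaze signal.

    gaze          : (n_samples,) gaze coordinate over time
    motion_params : (n_samples, 6) translation/rotation estimates
                    from fMRI motion correction
    """
    # Design matrix: intercept plus the six motion regressors.
    X = np.column_stack([np.ones(len(gaze)), motion_params])
    beta, *_ = np.linalg.lstsq(X, gaze, rcond=None)  # OLS fit
    # Subtract only the motion-predicted part, keeping the mean gaze level.
    predicted_drift = X[:, 1:] @ beta[1:]
    return gaze - predicted_drift
```

Because the correction uses motion estimates already produced by standard fMRI preprocessing, this style of approach needs no extra hardware and can be applied retrospectively, which is the advantage the abstract emphasizes.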