542 result(s) for "Hand-eye coordination"
Eye movement influences on coupled and decoupled eye-hand coordination tasks
Visually guided reaching precision and accuracy depend on the level of coupling between movements of the eyes and hand. In the present study, participants performed central fixations and either saccadic or smooth pursuit eye movements during fast and accurate reaching tasks involving eye–hand coupling and decoupling, to better understand how eye-movement type influences upper limb control. Some eye–hand coupling and decoupling tasks also included hand reversals, where the hand moves away from the target to direct a cursor toward the target, to account for various levels of hand–cursor and eye–cursor coupling. Regardless of eye-movement type, eye–hand–cursor coupling produced an endpoint accuracy advantage over decoupling. Use of hand reversal decreased peak speed and increased response time of the hand, whether considering fixation or a given eye movement. Use of smooth pursuit slowed hand movements relative to saccades, yet improved endpoint accuracy. Compared to central fixations, using smooth pursuit also slowed hand movements, while using saccades decreased, and thus improved, hand reaction times. Data suggest an advantage when using smooth pursuit to track the hand movement for the greatest endpoint accuracy, an advantage when using saccades for the fastest movements, and an eye–hand coupling advantage when using saccades for the shortest reaction times. Researchers should provide clear eye-movement instructions for participants and/or monitor the eyes when assessing similar upper limb control to account for possible differences in eye movements used. Moreover, the type of eye movement chosen for participants should correspond to the primary goal of the task.
Specialization of reach function in human posterior parietal cortex
Posterior parietal cortex (PPC) plays an important role in the planning and control of goal-directed action. Single-unit studies in monkeys have identified reach-specific areas in the PPC, but the degree of effector and computational specificity for reach in the corresponding human regions is still under debate. Here, we review converging evidence spanning functional neuroimaging, parietal patient and transcranial magnetic stimulation studies in humans that suggests a functional topography for reach within human PPC. We contrast reach to saccade and grasp regions to distinguish functional specificity and also to understand how these different goal-directed actions might be coordinated at the cortical level. First, we present the current evidence for reach specificity in distinct modules in PPC, namely superior parietal occipital cortex, midposterior intraparietal cortex and angular gyrus, compared to saccade and grasp. Second, we review the evidence for hemispheric lateralization (both for hand and visual hemifield) in these reach representations. Third, we review evidence for computational reach specificity in these regions and finally propose a functional framework for these human PPC reach modules that includes (1) a distinction between the encoding of reach goals in posterior–medial PPC as opposed to reach movement vectors in more anterior–lateral PPC regions, and (2) their integration within a broader cortical framework for reach, grasp and eye–hand coordination. These findings represent both a confirmation and extension of findings that were previously reported for the monkey.
Eye–hand coordination: memory-guided grasping during obstacle avoidance
When reaching to grasp previously seen, now out-of-view objects, we rely on stored perceptual representations to guide our actions, likely encoded by the ventral visual stream. So-called memory-guided actions are numerous in daily life, for instance, as we reach to grasp a coffee cup hidden behind our morning newspaper. Little research has examined obstacle avoidance during memory-guided grasping, though it is possible that obstacles with increased perceptual salience provoke exacerbated avoidance maneuvers, such as exaggerated deviations in eye and hand position away from obtrusive obstacles. We examined the obstacle avoidance strategies adopted as subjects reached to grasp a 3D target object under visually guided (closed loop, or open loop with full vision prior to movement onset) and memory-guided (short- or long-delay) conditions. On any given trial, subjects reached between a pair of flanker obstacles to grasp a target object. The positions and widths of the obstacles were manipulated, though their inner edges remained a constant distance apart. While reach and grasp behavior was consistent with the obstacle avoidance literature, in that reach, grasp, and gaze positions were biased away from obstacles most obtrusive to the reaching hand, our results reveal that the distinctive avoidance approaches undertaken depend on the availability of visual feedback. Contrary to expectation, we found that subjects reaching to grasp after a long delay in the absence of visual feedback failed to modify their final fixation and grasp positions to accommodate the different positions of obstacles, demonstrating a more moderate, rather than exaggerated, obstacle avoidance strategy.
Eye-Hand Coordination in Children with High Functioning Autism and Asperger’s Disorder Using a Gap-Overlap Paradigm
We investigated eye-hand coordination in children with autism spectrum disorders (ASD) in comparison with age-matched normally developing peers. Eye-hand correlation was measured by relating fixation latencies to pointing and key-pressing responses in visual detection tasks using a gap-overlap paradigm, and comparing them to fixation latencies in the absence of a manual response. ASD patients showed less efficient eye-hand coordination, which was particularly evident when pointing towards a target that was being fixated. The data of normally developing participants confirmed that manual gap effects are more likely for more complex hand movements. An important discrepancy was discovered in participants with ASD: besides normal eye gap effects, they showed no concurrent hand gap effects when pointing to targets. This result has been interpreted as a further sign of inefficient eye-hand coordination in this patient population.
Hand-Eye Coordination Predicts Joint Attention
The present article shows that infant and dyad differences in hand-eye coordination predict dyad differences in joint attention (JA). In the study reported here, 51 toddlers ranging in age from 11 to 24 months and their parents wore head-mounted eye trackers as they played with objects together. We found that physically active toddlers aligned their looking behavior with their parent and achieved a substantial proportion of time spent jointly attending to the same object. However, JA did not arise through gaze following but rather through the coordination of gaze with manual actions on objects as both infants and parents attended to their partner's object manipulations. Moreover, dyad differences in JA were associated with dyad differences in hand following.
Three-dimensional binocular eye–hand coordination in normal vision and with simulated visual impairment
Sensorimotor coupling in healthy humans is demonstrated by the higher accuracy of visually tracking intrinsically—rather than extrinsically—generated hand movements in the fronto-parallel plane. It is unknown whether this coupling also facilitates vergence eye movements for tracking objects in depth, or can overcome symmetric or asymmetric binocular visual impairments. Human observers were therefore asked to track with their gaze a target moving horizontally or in depth. The movement of the target was either directly controlled by the observer’s hand or followed hand movements executed by the observer in a previous trial. Visual impairments were simulated by blurring stimuli independently in each eye. Accuracy was higher for self-generated movements in all conditions, demonstrating that motor signals are employed by the oculomotor system to improve the accuracy of vergence as well as horizontal eye movements. Asymmetric monocular blur affected horizontal tracking less than symmetric binocular blur, but impaired tracking in depth as much as binocular blur. There was a critical blur level up to which pursuit and vergence eye movements maintained tracking accuracy independent of blur level. Hand–eye coordination may therefore help compensate for functional deficits associated with eye disease and may be employed to augment visual impairment rehabilitation.
Effects of auditory feedback on movements with two-segment sequence and eye–hand coordination
The present study investigated the effect of auditory feedback on planning and control of two-segment reaching movements and eye–hand coordination. In particular, it was examined whether additional auditory information indicating the progression of the initial reach (i.e., passing the midway point and contacting the target) affects the performance of that reach and the gaze shift to the second target at the transition between the two segments. Young adults performed a rapid two-segment reaching task, in which both the first and second segments had two target sizes. One of three auditory feedback conditions included the reach-progression information: a continuous tone was delivered at a consistent timing during the initial reach, from the midway point to the target contact. The other two were control conditions: a continuous tone was delivered at a random timing in one condition, or not delivered in the other. The results showed that the initial reach became more accurate with the auditory reach-progression cue compared to without any auditory cue. When that cue was available, movement time of the initial reach decreased, accompanied by an increased peak velocity and a decreased time to peak velocity. These findings suggest that the auditory reach-progression feedback enhanced the preplanned control of the initial reach. Deceleration time of that reach was also decreased with auditory feedback, but this was observed regardless of whether the sound contained the reach-progression information. At the transition between the two segments, the onset latencies of both the gaze shift and the reach to the second target became shorter with the auditory reach-progression cue, an effect that was pronounced when the initial reach had a higher terminal accuracy constraint. This suggests that the reach-progression cue enhanced verification of the termination of the initial reach, thereby facilitating the initiation of eye and hand movements to the second target. Taken together, additional auditory information about reach progression enhances the planning and control of multi-segment reaches and eye–hand coordination at the segment transition.
Design of a Virtual Multi-Interaction Operation System for Hand–Eye Coordination of Grape Harvesting Robots
In harvesting operations, simulation verification of hand–eye coordination in a virtual canopy is critical for harvesting robot research. More realistic scenarios, vision-based driving motion, and cross-platform interaction information are needed to achieve such simulations, which are very challenging. Current simulations are more focused on path planning operations for consistency scenarios, which are far from satisfying the requirements. To this end, a new approach of visual servo multi-interaction simulation in real scenarios is proposed. In this study, a dual-arm grape harvesting robot in the laboratory is used as an example. To overcome these challenges, a multi-software federation is first proposed to establish their communication and cross-software sending of image information, coordinate information, and control commands. Then, the fruit recognition and positioning algorithm, forward and inverse kinematic model and simulation model are embedded in OpenCV and MATLAB, respectively, to drive the simulation run of the robot in V-REP, thus realizing the multi-interaction simulation of hand–eye coordination in virtual trellis vineyard. Finally, the simulation is verified, and the results show that the average running time of a string-picking simulation system is 6.5 s, and the success rate of accurate picking point grasping reached 83.3%. A complex closed loop of “scene-image recognition-grasping” is formed by data processing and transmission of various information. It can effectively realize the continuous hand–eye coordination multi-interaction simulation of the harvesting robot under the virtual environment.
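The "scene-image recognition-grasping" loop described in this abstract hinges on turning a segmented fruit image into a picking-point coordinate that can drive the simulated arm. The following is a minimal, hypothetical sketch of that positioning step only, using a plain NumPy centroid over a binary mask as a stand-in for the OpenCV recognition pipeline the authors describe; the function name and synthetic mask are illustrative, not taken from the paper.

```python
import numpy as np

def find_picking_point(mask: np.ndarray):
    """Return the (row, col) centroid of the foreground pixels in a
    binary segmentation mask, or None if nothing was detected.

    In a full pipeline this coordinate would be converted to robot
    coordinates and sent to the arm's inverse-kinematics solver."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no fruit detected in this frame
    return (int(round(ys.mean())), int(round(xs.mean())))

# Synthetic 10x10 mask standing in for a segmented grape cluster.
mask = np.zeros((10, 10), dtype=np.uint8)
mask[3:6, 4:7] = 1  # a small blob of "fruit" pixels

print(find_picking_point(mask))  # → (4, 5)
```

In the system the abstract describes, this coordinate would then cross the multi-software federation (image side to MATLAB kinematics to the V-REP simulation) to close the hand–eye loop.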
Sex differences in the neural underpinnings of unimanual and bimanual control in adults
While many of the movements we make throughout our day involve just one upper limb, most daily movements require a certain degree of coordination between both upper limbs. Historically, sex differences in eye-hand coordination have been observed. There are also demonstrated sex-specific differences in hemisphere symmetry, interhemispheric connectivity, and motor cortex organization. While it has been suggested that these anatomical differences may underlie sex-related differences in performance, sex differences in the functional neural correlates underlying bimanual performance have not been explicitly investigated. In the current study we tested the hypothesis that the functional connectivity underlying bimanual movement control differs depending on the sex of the individual. Participants underwent MRI scanning to acquire anatomical and functional brain images. During the functional runs, participants performed unimanual and bimanual coordination tasks using two button boxes. The tasks included pressing the buttons in time to an auditory cue with either their left or their right hand individually (unimanual), or with both hands simultaneously (bimanual). The bimanual task was further divided into an in-phase (mirror/symmetrical) and an anti-phase (parallel/asymmetrical) condition. Participants were given extensive training to ensure task comprehension, and performance error rates were found to be equivalent between men and women. A generalized psychophysiological interaction (gPPI) analysis was implemented to examine how functional connectivity in each condition was modulated by sex. In support of our hypothesis, women and men demonstrated differences in the neural correlates underlying unimanual and bimanual movements. In line with previous literature, functional connectivity patterns showed sex-related differences for right- vs left-hand movements. Sex-specific functional connectivity during bimanual movements was not a sum of the functional connectivity underlying right- and left-hand unimanual movements. Further, women generally showed greater interhemispheric functional connectivity across all conditions compared to men, and had greater connectivity between task-related cortical areas, while men had greater connectivity involving the cerebellum. Sex differences in brain connectivity were associated with both unimanual and bimanual movement control. Not only do these findings provide novel insight into the fundamentals of how the brain controls bimanual movements in both women and men, they also have potential clinical implications for how bimanual movement training used in rehabilitation can best be tailored to the needs of individuals.