5,147 results for "Hand movements"
The statistics of natural hand movements
Humans constantly use their hands to interact with the environment and they engage spontaneously in a wide variety of manual activities during everyday life. In contrast, laboratory-based studies of hand function have used a limited range of predefined tasks. The natural movements made by the hand during everyday life have thus received little attention. Here, we developed a portable recording device that can be worn by subjects to track movements of their right hand as they go about their daily routine outside of a laboratory setting. We analyse the kinematic data using various statistical methods. Principal component analysis of the joint angular velocities showed that the first two components were highly conserved across subjects, explained 60% of the variance and were qualitatively similar to those reported in previous studies of reach-to-grasp movements. To examine the independence of the digits, we developed a measure based on the degree to which the movements of each digit could be linearly predicted from the movements of the other four digits. Our independence measure was highly correlated with results from previous studies of the hand, including the estimated size of the digit representations in primary motor cortex and other laboratory measures of digit individuation. Specifically, the thumb was found to be the most independent of the digits and the index finger was the most independent of the fingers. These results support and extend laboratory-based studies of the human hand.
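The digit-independence measure described above (the degree to which each digit's movement can be linearly predicted from the other four) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' code: the coupling structure, channel count, and variable names are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for recorded joint angular velocities:
# 1000 time samples x 5 digits (one velocity channel per digit, for simplicity).
# Digits 2-5 share a common signal; digit 1 (the "thumb") moves independently.
shared = rng.standard_normal((1000, 1))
velocities = rng.standard_normal((1000, 5))
velocities[:, 1:] += 2.0 * shared  # couple the four "fingers"

def digit_independence(v):
    """For each digit, fit a least-squares linear prediction of its velocity
    from the other four digits; independence = 1 - R^2 of that fit."""
    scores = []
    for d in range(v.shape[1]):
        others = np.delete(v, d, axis=1)
        target = v[:, d]
        coef, *_ = np.linalg.lstsq(others, target, rcond=None)
        resid = target - others @ coef
        r2 = 1.0 - np.sum(resid ** 2) / np.sum((target - target.mean()) ** 2)
        scores.append(1.0 - r2)
    return np.array(scores)

ind = digit_independence(velocities)
# The uncoupled digit should come out as the most independent one.
```

With real data one would use the full set of joint angle velocities per digit rather than a single channel, but the predict-from-the-others logic is the same.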
Motor congruency and multisensory integration jointly facilitate visual information processing before movement execution
Attention allows us to select important sensory information and enhances sensory information processing. Attention and our motor system are tightly coupled: attention is shifted to the target location before a goal-directed eye- or hand movement is executed. Congruent eye–hand movements to the same target can boost the effect of this pre-movement shift of attention. Moreover, visual information processing can be enhanced by, for example, auditory input presented in spatial and temporal proximity of visual input via multisensory integration (MSI). In this study, we investigated whether the combination of MSI and motor congruency can synergistically enhance visual information processing beyond what can be observed using motor congruency alone. Participants performed congruent eye- and hand movements during a 2-AFC visual discrimination task. The discrimination target was presented in the planning phase of the movements at the movement target location or a movement irrelevant location. Three conditions were compared: (1) a visual target without sound, (2) a visual target with sound spatially and temporally aligned (MSI) and (3) a visual target with sound temporally misaligned (no MSI). Performance was enhanced at the movement-relevant location when congruent motor actions and MSI coincide compared to the other conditions. Congruence in the motor system and MSI together therefore lead to enhanced sensory information processing beyond the effects of motor congruency alone, before a movement is executed. Such a synergy implies that the boost of attention previously observed for the independent factors is not at ceiling level, but can be increased even further when the right conditions are met.
Grip forces during fast point-to-point and continuous hand movements
Three experiments investigated the grip force exerted by the fingers on an object displaced actively in the near-body space. In one condition (unimanual) the object was held by one hand with the tripod grip and was moved briskly back and forth along one of the three coordinate directions (up–down, left–right, near–far). In the second condition (bimanual) the same point-to-point movements were performed while holding the object with the index and middle fingers of both hands. In the third condition (bimanual) the object was held as in the second condition and moved along a circular path lying in one of the three coordinate planes (horizontal, frontal, sagittal). In all conditions participants were asked to exert a baseline level of grip force largely exceeding the safety margin against slippage. Both grip forces and hand displacements were measured with high accuracy. As reported in previous studies, in the two point-to-point conditions we observed an upsurge of the grip force at the onset and at the end of the movements. However, the timing of the transient increases of the grip force relative to hand kinematics did not confirm the hypothesis set forth by several previous studies that grip modulation is a pre-planned action based on an internal model of the expected effects of the movement. In the third condition, the systematic modulation of the grip force for circular movements was likewise at variance with the internal model hypothesis, because it cannot be construed as a pre-planned action aiming at countering large changes in dynamic load. We argue that a parsimonious account of the covariations of load and grip forces can be offered by taking into account the visco-elastic properties of the neuromuscular system.
Nonverbal Hand Movement Durations Indicate Post-Concussion Symptoms of Athletes
Methods of post-concussion diagnosis are still under debate with regard to sensitivity, objectivity, reliability, and costs. Spontaneous displays of nonverbal hand movement behavior during interaction are indicative of psychopathology and are relatively simple to record and analyze. Increased continuous (irregular) body-focused hand movement activity in particular indicates psychopathologies that overlap in symptomatology with those of sport-related concussions (SRCs). We therefore hypothesized that the duration of “irregular,” “on body,” and “act on each other” hand movements is increased in athletes with SRC who suffer from post-concussion symptoms. Three matched groups totaling 40 athletes were investigated: 14 symptomatic athletes with a concussion, 14 asymptomatic athletes with a concussion, and 12 non-concussed athletes. Using the Neuropsychological Gesture (NEUROGES)-Elan analysis system, four certified raters analyzed all nonverbal hand movements that were displayed during a videotaped standardized anamnesis about concussion history, incidence, course of action, and post-concussion symptoms. The duration of irregular Structure units among symptomatic athletes was significantly longer compared with asymptomatic athletes. Irregular, on body, and act on each other hand movement durations correlated with post-concussion symptoms. Whereas the duration of irregular units significantly predicted the post-concussion symptom score, working memory performances showed only marginal effects. Increased duration of irregular hand movement units indicates post-concussion symptoms in athletes with SRC. Because the recording of spontaneous displays of nonverbal hand movement behavior is relatively simple and cost efficient, we suggest using the neuropsychological analysis of hand movement behavior as a future diagnostic parameter of concussion management protocols.
Hand movements with a phase structure and gestures that depict action stem from a left hemispheric system of conceptualization
The present study addresses the previously discussed controversy on the contribution of the right and left cerebral hemispheres to the production and conceptualization of spontaneous hand movements and gestures. Although it has been shown that each hemisphere contains the ability to produce hand movements, results of left hemispherically lateralized motor functions challenge the view of a contralateral hand movement production system. To examine hemispheric specialization in hand movement and gesture production, ten right-handed participants were tachistoscopically presented pictures of everyday life actions. The participants were asked to demonstrate with their hands, but without speaking, what they had seen on the drawing. Two independent blind raters evaluated the videotaped hand movements and gestures employing the Neuropsychological Gesture Coding System. The results showed that the overall frequency of right- and left-hand movements is equal independent of stimulus lateralization. When hand movements were analyzed considering their Structure, the presentation of the action stimuli to the left hemisphere resulted in more hand movements with a phase structure than the presentation to the right hemisphere. Furthermore, the presentation to the left hemisphere resulted in more right- and left-hand movements with a phase structure, whereas the presentation to the right hemisphere only increased contralateral left-hand movements with a phase structure as compared to hand movements without a phase structure. Gestures that depict action were displayed more frequently in response to stimuli presented in the right visual field than to those in the left one. The present study shows that both hemispheres possess the faculty to produce hand movements in response to action stimuli. However, the left hemisphere dominates the production of hand movements with a phase structure and gestures that depict action. We therefore conclude that hand movements with a phase structure and gestures that represent action stem from a left hemispheric system of conceptualization.
The latency for correcting a movement depends on the visual attribute that defines the target
Neurons in different cortical visual areas respond to different visual attributes with different latencies. How does this affect the on-line control of our actions? We studied hand movements directed toward targets that could be distinguished from other objects by luminance, size, orientation, color, shape or texture. In some trials, the target changed places with one of the other objects at the onset of the hand’s movement. We determined the latency for correcting the movement of the hand in the direction of the new target location. We show that subjects can correct their movements at short latency for all attributes, but that responses for the attributes color, form and texture (which are relevant for recognizing the object) are 50 ms slower than for the attributes luminance, orientation and size. This dichotomy corresponds both to the distinction between magno-cellular and parvo-cellular pathways and to a dorsal–ventral distinction. The latency also differed systematically between subjects, independent of their reaction time.
Functional and Structural Properties of Interhemispheric Interaction between Bilateral Precentral Hand Motor Regions in a Top Wheelchair Racing Paralympian
Long-term motor training can cause functional and structural changes in the human brain. Assessing how the training of specific movements affects specific parts of the neural circuitry is essential to understand better the underlying mechanisms of motor training-induced plasticity in the human brain. We report a single-case neuroimaging study that investigated functional and structural properties in a professional athlete of wheelchair racing. As wheelchair racing requires bilateral synchronization of upper limb movements, we hypothesized that functional and structural properties of interhemispheric interactions in the central motor system might differ between the professional athlete and controls. Functional and diffusion magnetic resonance imaging (fMRI and dMRI) data were obtained from a top Paralympian (P1) in wheelchair racing. With 23 years of wheelchair racing training starting at age eight, she holds an exceptional competitive record. Furthermore, fMRI and dMRI data were collected from three other paraplegic participants (P2-P4) with long-term wheelchair sports training other than wheelchair racing and 37 able-bodied control volunteers. Based on the fMRI data analyses, P1 showed activation in the bilateral precentral hand sections and greater functional connectivity between these sections during a right-hand unimanual task. In contrast, other paraplegic participants and controls showed activation in the contralateral hemisphere and deactivation in the ipsilateral hemisphere. Moreover, dMRI data analysis revealed that P1 exhibited significantly lower mean diffusivity along the transcallosal pathway connecting the bilateral precentral motor regions than control participants, which was not observed in the other paraplegic participants. These results suggest that long-term training with bilaterally synchronized upper-limb movements may promote bilateral recruitment of the precentral hand sections. Such recruitment may affect the structural circuitry involved in the interhemispheric interaction between the bilateral precentral regions. This study provides valuable evidence of the extreme adaptability of the human brain.
SignFormer-GCN: Continuous sign language translation using spatio-temporal graph convolutional networks
Sign language is a complex visual language system that uses hand gestures, facial expressions, and body movements to convey meaning. It is the primary means of communication for millions of deaf and hard-of-hearing individuals worldwide. Tracking physical actions, such as hand movements and arm orientation, alongside expressive actions, including facial expressions, mouth movements, eye movements, eyebrow gestures, head movements, and body postures, using only RGB features can be limiting due to discrepancies in backgrounds and signers across different datasets. Despite this limitation, most Sign Language Translation (SLT) research relies solely on RGB features. We used keypoint features alongside RGB features to better capture the pose and configuration of the body parts involved in sign language actions and to complement the RGB features. Similarly, most SLT research has used transformers, which are good at capturing broader, high-level context and focusing on the most relevant video frames. Still, the inherent graph structure associated with sign language is neglected, and low-level details are not captured. To solve this, we used a joint encoding technique combining a transformer and an STGCN architecture to capture both the context of sign language expressions and the spatial and temporal dependencies on skeleton graphs. Experimentally, our method, SignFormer-GCN, achieves competitive performance on the RWTH-PHOENIX-2014T, How2Sign, and BornilDB v1.0 datasets, showcasing its effectiveness in enhancing translation accuracy across different sign languages. The code is available at the following link: https://github.com/rabeya-akter/SignLanguageTranslation.
Doing Psychological Science by Hand
Over the past decade, mouse tracking in choice tasks has become a popular method across psychological science. This method exploits hand movements as a measure of multiple response activations that can be tracked continuously over hundreds of milliseconds. Whereas early mouse-tracking research focused on specific debates, researchers have realized that the methodology has far broader theoretical value. This more recent work demonstrates that mouse tracking is a widely applicable measure across the field, capable of exposing the microstructure of real-time decisions, including their component processes and millisecond-resolution time course, in ways that inform theory. In this article, recent advances in the mouse-tracking approach are described, and comparisons with the gold standard measure of reaction time and other temporally sensitive methodologies are provided. Future directions, including mapping to neural representations with brain imaging and ways to improve our theoretical understanding of mouse-tracking methodology, are discussed.
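A common way mouse-tracking studies quantify "multiple response activations" is the maximum deviation of the cursor's path from the straight line between start and end points. The sketch below illustrates that computation on a toy trajectory; the trajectory shape and all names are assumptions for the example, not a measure taken from the article above.

```python
import numpy as np

# A toy mouse trajectory: (x, y) samples recorded during a choice,
# curving toward the unchosen alternative before settling on the target.
t = np.linspace(0.0, 1.0, 101)
x = t
y = 0.6 * np.sin(np.pi * t)  # deviation bulge mid-flight

def max_abs_deviation(xs, ys):
    """Maximum perpendicular distance of the trajectory from the straight
    line connecting its start and end points -- an index of how strongly
    the competing response attracted the movement."""
    p0 = np.array([xs[0], ys[0]])
    p1 = np.array([xs[-1], ys[-1]])
    line = p1 - p0
    line /= np.linalg.norm(line)
    pts = np.stack([xs, ys], axis=1) - p0
    # Perpendicular component via the 2D cross product with the unit line vector.
    perp = pts[:, 0] * line[1] - pts[:, 1] * line[0]
    return float(np.max(np.abs(perp)))

mad = max_abs_deviation(x, y)
```

Larger values indicate a trajectory that was pulled further toward the alternative response before committing, which is what gives the method its millisecond-resolution view of decision competition.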
Slightly perturbing the arm influences choices between multiple targets
We constantly make choices about how to interact with objects in the environment. Do we immediately consider changes in our posture when making such choices? To find out, we examined whether motion in the background, which is known to influence the trajectory of goal-directed hand movements, influences participants' choices when suddenly faced with two options. The participants' task was to tap on as many sequentially presented targets as possible within 90 seconds. Sometime after a new target appeared, it split into two targets and participants had to choose which of them to hit. Shortly before the split, the background moved in a way that was expected to result in the finger shifting slightly towards one of the two new targets. We examined whether such shifts influenced the choice between the two targets. The moving background influenced the finger movements in the expected manner: participants moved in the direction of the background motion. It also influenced the choice that participants made between the two targets: participants more frequently chose the target in the direction of the background motion. There was a positive correlation across participants between the magnitude of the response to background motion and the bias to choose the target in the direction of such motion. Thus, people consider sudden changes in their posture when choosing between different movement options.