17,538 result(s) for "Gestures"
Human kindness
Kindness comes in many forms and affects all of us. As Mark Twain said, 'Kindness is the language which the deaf can hear and the blind can see.' And while a kind gesture can often simply make someone feel better about their day, sometimes, as the twenty-five true stories collected here show, it can save a life. Sourced from around the world, these are stories of the everyday and the extraordinary. From the woman who stopped a suicidal man from jumping just by taking the time to listen to him, to the couple who fostered a baby they found abandoned in a rubbish bin when no one else could help; from the students who came to the rescue of an elderly man fallen on black ice, to the response of a terrorist leader when confronted by a young child's cries for her favourite doll, these are stories of unexpected kindness that had a lasting impact on the recipient. Interspersed between the stories are quotes about kindness by people as diverse as Audrey Hepburn, Lao Tzu, Ellen DeGeneres and Ralph Waldo Emerson. The result is a book that explores all that is best about human nature.
Great ape gestures: intentional communication with a rich set of innate signals
Great apes give gestures deliberately and voluntarily, in order to influence particular target audiences, whose direction of attention they take into account when choosing which type of gesture to use. These facts make the study of ape gesture directly relevant to understanding the evolutionary precursors of human language; here we present an assessment of ape gesture from that perspective, focusing on the work of the “St Andrews Group” of researchers. Intended meanings of ape gestures are relatively few and simple. As with human words, ape gestures often have several distinct meanings, which are effectively disambiguated by behavioural context. Compared to the signalling of most other animals, great ape gestural repertoires are large. Because of this, and the relatively small number of intended meanings they achieve, ape gestures are redundant, with extensive overlaps in meaning. The great majority of gestures are innate, in the sense that the species’ biological inheritance includes the potential to develop each gestural form and use it for a specific range of purposes. Moreover, the phylogenetic origin of many gestures is relatively old, since gestures are extensively shared between different genera in the great ape family. Acquisition of an adult repertoire is a process of first exploring the innate species potential for many gestures and then gradual restriction to a final (active) repertoire that is much smaller. No evidence of syntactic structure has yet been detected.
On free-hand TV control: experimental results on user-elicited gestures with Leap Motion
We present insights from a gesture elicitation study conducted for TV control, during which 18 participants contributed gesture commands and rated the execution difficulty and recall likeliness of free-hand gestures for 21 television control tasks. Our study complements previous work on gesture interaction design for the TV set with the first exploration of fine-grained resolution 3-D finger movements and hand gestures. We report lower agreement rates than previous gesture studies (AR = .158), with a 72.8% recall rate and 15.8% false positives, results that are explained by the complexity and variability of unconstrained finger and hand gestures. However, our observations also confirm previous findings, such as people preferring related gestures for dichotomous tasks and more disagreement occurring for abstract tasks, such as “open browser” or “show the list of channels” in our specific TV scenario. To reach a better understanding of our participants’ preferences for articulating finger and hand gestures, we defined five measures for Leap Motion gestures, such as gesture volume and finger-to-palm distance, which we employed to evaluate the gestures performed by our participants. We also contribute a set of guidelines for practitioners interested in designing free-hand gestures for interactive TV scenarios involving similar gesture acquisition technology. We release our dataset, consisting of 378 Leap Motion gestures described by fingertip position, direction, and velocity coordinates, to foster further studies in the community. This first exploration of viewers’ preferences for fine-grained resolution free-hand gestures for TV control represents one more step toward designing low-effort gesture interfaces for lean-back interaction with the TV set.
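The agreement rate (AR) figure quoted in this abstract is conventionally computed with the Vatavu–Wobbrock formula used in gesture elicitation studies. A minimal sketch of that computation, assuming proposals for a single referent are grouped by identical gesture labels (the function name and example labels are illustrative, not from the paper):

```python
from collections import Counter

def agreement_rate(proposals):
    """Vatavu-Wobbrock agreement rate for one referent:
    AR = |P|/(|P|-1) * sum_i (|P_i|/|P|)^2 - 1/(|P|-1),
    where P is the multiset of gestures proposed for the referent
    and the P_i are its groups of identical proposals."""
    n = len(proposals)
    if n < 2:
        return 0.0  # AR is undefined for fewer than two proposals
    squares = sum((count / n) ** 2 for count in Counter(proposals).values())
    return n / (n - 1) * squares - 1 / (n - 1)

# Hypothetical proposals from four participants for one referent:
# two pairs agree, giving AR = 1/3
print(agreement_rate(["swipe-up", "swipe-up", "point", "point"]))
```

Averaging this quantity over all referents (21 tasks in the study above) yields the study-level AR.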
A systematic review on hand gesture recognition techniques, challenges and applications
With the development of today's technology, and as humans tend to naturally use hand gestures in their communication to clarify their intentions, hand gesture recognition is considered an important part of Human-Computer Interaction (HCI), which gives computers the ability to capture and interpret hand gestures and execute commands accordingly. The aim of this study is to perform a systematic literature review identifying the most prominent techniques, applications and challenges in hand gesture recognition. To conduct this systematic review, we screened 560 papers retrieved from IEEE Xplore, published from 2016 to 2018; keywords such as "hand gesture recognition" and "hand gesture techniques" were used in the search process. However, to focus the scope of the study, 465 papers were excluded. Only the hand gesture recognition works most relevant to the research questions, and the best-organized papers, were studied. The results of this paper can be summarized as follows: surface electromyography (sEMG) sensors with wearable hand gesture devices were the most used acquisition tools in the works studied; the Artificial Neural Network (ANN) was the most applied classifier; the most popular application was using hand gestures for sign language; the dominant environmental factor that affected accuracy was the background color; and the problem of overfitting in the datasets was frequently encountered. The paper discusses gesture acquisition methods, the feature extraction process, the classification of hand gestures, recently proposed applications, the challenges that face researchers in the hand gesture recognition process, and the future of hand gesture recognition. We also introduce, for the first time, the most recent research in the field of hand gesture recognition from 2016 to 2018.
Real-Time Hand Gesture Recognition Using Fine-Tuned Convolutional Neural Network
Hand gesture recognition is one of the most effective modes of interaction between humans and computers, being highly flexible and user-friendly. A real-time hand gesture recognition system should aim to provide a user-independent interface with high recognition performance. Nowadays, convolutional neural networks (CNNs) show high recognition rates in image classification problems. Due to the unavailability of large labeled samples of static hand gesture images, it is a challenging task to train deep CNNs such as AlexNet, VGG-16 and ResNet from scratch. Therefore, inspired by CNN performance, an end-to-end fine-tuning method for a pre-trained CNN model with a score-level fusion technique is proposed here to recognize hand gestures in a dataset with a low number of gesture images. The effectiveness of the proposed technique is evaluated using leave-one-subject-out cross-validation (LOO CV) and regular CV tests on two benchmark datasets. A real-time American Sign Language (ASL) recognition system is developed and tested using the proposed technique.
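The leave-one-subject-out protocol mentioned in this abstract tests user independence by holding out every sample from one subject per fold and training on the rest. A minimal sketch of just the split logic, with hypothetical subject IDs (the function name and dataset are illustrative, not from the paper; no classifier is included):

```python
def loso_splits(subject_ids):
    """Leave-one-subject-out splits: for each distinct subject,
    yield (held_out_subject, train_indices, test_indices), where
    the test fold contains every sample from that subject."""
    for held_out in sorted(set(subject_ids)):
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        yield held_out, train, test

# Hypothetical dataset: six gesture images from three subjects
ids = ["s1", "s1", "s2", "s2", "s3", "s3"]
for subject, train, test in loso_splits(ids):
    print(subject, train, test)
```

Averaging per-fold accuracy over these splits gives the user-independent recognition rate the evaluation reports.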
Gesture-Based Physical Stability Classification and Rehabilitation System
This paper introduces the Gesture-Based Physical Stability Classification and Rehabilitation System (GPSCRS), a low-cost, non-invasive solution for evaluating physical stability using an Arduino microcontroller and the DFRobot Gesture and Touch sensor. The system quantifies movement smoothness, consistency, and speed by analyzing “up” and “down” hand gestures over a fixed period, generating a Physical Stability Index (PSI) as a single metric to represent an individual’s stability. The system focuses on a temporal analysis of gesture patterns while incorporating placeholders for speed scores to demonstrate its potential for a comprehensive stability assessment. The performance of various machine learning and deep learning models for gesture-based classification is evaluated, with neural network architectures such as Transformer, CNN, and KAN achieving perfect scores in recall, accuracy, precision, and F1-score. Traditional machine learning models such as XGBoost show strong results, offering a balance between computational efficiency and accuracy. The choice of model depends on specific application requirements, including real-time constraints and available resources. The preliminary experimental results indicate that the proposed GPSCRS can effectively detect changes in stability under real-time conditions, highlighting its potential for use in remote health monitoring, fall prevention, and rehabilitation scenarios. By providing a quantitative measure of stability, the system enables early risk identification and supports tailored interventions for improved mobility and quality of life.
Visuo-spatial complexity potentiates the body-part effect in intransitive imitation of meaningless gestures
Recent studies on the imitation of intransitive gestures suggest that the body-part effect relies mainly upon the direct route of the dual-route model, through a visuo-transformation mechanism. Here, we test the visuo-constructive hypothesis, which posits that visual complexity may directly potentiate the body-part effect for meaningless gestures. We predicted that the difference between imitation of hand and finger gestures would increase with the visuo-spatial complexity of gestures. Second, we aimed to identify some of the visuo-spatial predictors of meaningless finger imitation skills. Thirty-eight participants underwent an imitation task containing three distinct sets of gestures: meaningful gestures, meaningless gestures with low visual complexity, and meaningless gestures with higher visual complexity than the first meaningless set. Our results were in general agreement with the visuo-constructive hypothesis, showing an increase in the difference between hand and finger gestures, but only for meaningless gestures with higher visuo-spatial complexity. Regression analyses confirmed that imitation accuracy decreases with resource-demanding visuo-spatial factors. Taken together, our results suggest that the body-part effect is highly dependent on the visuo-spatial characteristics of the gestures.
Italian Sign Language from a Cognitive and Socio-Semiotic Perspective
This volume reveals new insights into the faculty of language by proposing a new approach to the analysis and description of Italian Sign Language (LIS) that can also be extended to other sign languages.
Abnormal Gesture Perception and Clinical High-Risk for Psychosis
Individuals diagnosed with psychotic disorders exhibit abnormalities in the perception of expressive behaviors, which are linked to symptoms and visual information processing domains. Specifically, the literature suggests these groups have difficulty perceiving gestures that accompany speech. While our understanding of gesture perception in psychotic disorders is growing, knowledge of gesture perception abnormalities, and clues about their potential causes and consequences, among individuals meeting criteria for a clinical high-risk (CHR) syndrome is limited. Presently, 29 individuals with a CHR syndrome and 32 healthy controls completed an eye-tracking gesture perception paradigm. In this task, participants viewed an actor using abstract and literal gestures while presenting a story, and eye gaze data (e.g., fixation counts and total fixation time) were collected. Furthermore, relationships between fixation variables and both symptoms (positive, negative, anxiety, and depression) and measures of visual information processing (working memory and attention) were examined. Findings revealed that the CHR group gazed at abstract gestures fewer times than the control group. When individuals in the CHR group did gaze at abstract gestures, on average, they spent significantly less time fixating compared to controls. Furthermore, reduced fixation (i.e., count and time) was related to depression and slower response time on an attentional task. While a similar pattern of group differences in the same direction appeared for literal gestures, the effect was not significant. These data highlight the importance of integrating gesture perception abnormalities into vulnerability models of psychosis and inform the development of targeted treatments for social communicative deficits.