Catalogue Search | MBRL
Explore the vast range of titles available.
17,813 result(s) for "Gestures."
Human kindness
Kindness comes in many forms and affects all of us. As Mark Twain said, 'Kindness is the language which the deaf can hear and the blind can see.' While a kind gesture can often simply make someone feel better about their day, sometimes, as the twenty-five true stories collected here show, it can save a life. Sourced from around the world, these are stories of the everyday and the extraordinary: from the woman who stopped a suicidal man from jumping just by taking the time to listen to him, to the couple who fostered a baby they found abandoned in a rubbish bin when no one else could help; from the students who came to the rescue of an elderly man who had fallen on black ice, to the response of a terrorist leader when confronted by a young child's cries for her favourite doll. These are stories of unexpected kindness that had a lasting impact on the recipient. Interspersed between the stories are quotes about kindness by people as diverse as Audrey Hepburn, Lao Tzu, Ellen DeGeneres and Ralph Waldo Emerson. The result is a book that explores all that is best about human nature.
Great ape gestures: intentional communication with a rich set of innate signals
Great apes give gestures deliberately and voluntarily, in order to influence particular target audiences, whose direction of attention they take into account when choosing which type of gesture to use. These facts make the study of ape gesture directly relevant to understanding the evolutionary precursors of human language; here we present an assessment of ape gesture from that perspective, focusing on the work of the “St Andrews Group” of researchers. Intended meanings of ape gestures are relatively few and simple. As with human words, ape gestures often have several distinct meanings, which are effectively disambiguated by behavioural context. Compared to the signalling of most other animals, great ape gestural repertoires are large. Because of this, and the relatively small number of intended meanings they achieve, ape gestures are redundant, with extensive overlaps in meaning. The great majority of gestures are innate, in the sense that the species’ biological inheritance includes the potential to develop each gestural form and use it for a specific range of purposes. Moreover, the phylogenetic origin of many gestures is relatively old, since gestures are extensively shared between different genera in the great ape family. Acquisition of an adult repertoire is a process of first exploring the innate species potential for many gestures and then gradual restriction to a final (active) repertoire that is much smaller. No evidence of syntactic structure has yet been detected.
Journal Article
On free-hand TV control: experimental results on user-elicited gestures with Leap Motion
by Vatavu, Radu-Daniel; Zaiţi, Ionuţ-Alexandru; Pentiuc, Ştefan-Gheorghe
in Channels, Computer Science, Exploration
2015
We present insights from a gesture elicitation study conducted for TV control, during which 18 participants contributed gesture commands and rated the execution difficulty and recall likeliness of free-hand gestures for 21 television control tasks. Our study complements previous work on gesture interaction design for the TV set with the first exploration of fine-grained resolution 3-D finger movements and hand gestures. We report lower agreement rates than previous gesture studies (AR = .158) with a 72.8% recall rate and 15.8% false positives, results that are explained by the complexity and variability of unconstrained finger and hand gestures. However, our observations also confirm previous findings, such as people preferring related gestures for dichotomous tasks and more disagreement occurring for abstract tasks, such as "open browser" or "show the list of channels" for our specific TV scenario. To reach a better understanding of our participants' preferences for articulating finger and hand gestures, we defined five measures for Leap Motion gestures, such as gesture volume and finger-to-palm distance, which we employed to evaluate gestures performed by our participants. We also contribute a set of guidelines for practitioners interested in designing free-hand gestures for interactive TV scenarios involving similar gesture acquisition technology. We release our dataset consisting of 378 Leap Motion gestures described by fingertip position, direction, and velocity coordinates to foster further studies in the community. This first exploration of viewers' preferences for fine-grained resolution free-hand gestures for TV control represents one more step toward designing low-effort gesture interfaces for lean-back interaction with the TV set.
Journal Article
A systematic review on hand gesture recognition techniques, challenges and applications
2019
With the development of today's technology, and as humans tend to naturally use hand gestures in their communication to clarify their intentions, hand gesture recognition is considered an important part of Human-Computer Interaction (HCI), giving computers the ability to capture and interpret hand gestures and to execute commands accordingly. The aim of this study is to perform a systematic literature review identifying the most prominent techniques, applications and challenges in hand gesture recognition.
To conduct this systematic review, we screened 560 papers retrieved from IEEE Xplore published from 2016 to 2018; keywords such as "hand gesture recognition" and "hand gesture techniques" were used in the search process. To focus the scope of the study, 465 papers were excluded; only the hand gesture recognition works most relevant to the research questions, and the best-organized papers, were studied.
The results of this paper can be summarized as follows: surface electromyography (sEMG) sensors with wearable hand gesture devices were the most used acquisition tool in the works studied; the Artificial Neural Network (ANN) was the most applied classifier; the most popular application was using hand gestures for sign language; the dominant environmental factor affecting accuracy was the background color; and overfitting on the datasets was a frequently encountered problem.
The paper discusses the gesture acquisition methods, the feature extraction process, the classification of hand gestures, the recently proposed applications, the challenges that face researchers in the hand gesture recognition process, and the future of hand gesture recognition. We also introduce, for the first time, the most recent research in the field of hand gesture recognition from 2016 to 2018.
Journal Article
Real-Time Hand Gesture Recognition Using Fine-Tuned Convolutional Neural Network
by Sahoo, Jaya Prakash; Pławiak, Paweł; Prakash, Allam Jaya
in Accuracy, Analysis, Classification
2022
Hand gesture recognition is one of the most effective modes of interaction between humans and computers due to being highly flexible and user-friendly. A real-time hand gesture recognition system should aim to develop a user-independent interface with high recognition performance. Nowadays, convolutional neural networks (CNNs) show high recognition rates in image classification problems. Due to the unavailability of large labeled image samples in static hand gesture images, it is a challenging task to train deep CNN networks such as AlexNet, VGG-16 and ResNet from scratch. Therefore, inspired by CNN performance, an end-to-end fine-tuning method of a pre-trained CNN model with score-level fusion technique is proposed here to recognize hand gestures in a dataset with a low number of gesture images. The effectiveness of the proposed technique is evaluated using leave-one-subject-out cross-validation (LOO CV) and regular CV tests on two benchmark datasets. A real-time American sign language (ASL) recognition system is developed and tested using the proposed technique.
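The score-level fusion described above combines the class scores of multiple fine-tuned models before picking a label. A minimal sketch of one common fusion rule, averaging softmax probabilities from two classifier heads (the specific rule and score shapes are assumptions, not the paper's exact method):

```python
import math

def softmax(logits):
    """Convert raw classifier scores to a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_scores(logits_a, logits_b):
    """Score-level fusion: average per-class probabilities from two
    classifiers, then return (predicted class index, fused scores)."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    fused = [(x + y) / 2 for x, y in zip(pa, pb)]
    return max(range(len(fused)), key=fused.__getitem__), fused

# Two hypothetical CNN heads scoring a 3-class gesture; fusion picks class 1.
cls, fused = fuse_scores([1.0, 2.0, 0.5], [0.8, 1.5, 1.2])
print(cls)  # 1
```

In practice the two score vectors would come from differently fine-tuned pre-trained CNNs evaluated on the same gesture image.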
Journal Article
Gesture-Based Physical Stability Classification and Rehabilitation System
2025
This paper introduces the Gesture-Based Physical Stability Classification and Rehabilitation System (GPSCRS), a low-cost, non-invasive solution for evaluating physical stability using an Arduino microcontroller and the DFRobot Gesture and Touch sensor. The system quantifies movement smoothness, consistency, and speed by analyzing “up” and “down” hand gestures over a fixed period, generating a Physical Stability Index (PSI) as a single metric to represent an individual’s stability. The system focuses on a temporal analysis of gesture patterns while incorporating placeholders for speed scores to demonstrate its potential for a comprehensive stability assessment. The performance of various machine learning and deep learning models for gesture-based classification is evaluated, with neural network architectures such as Transformer, CNN, and KAN achieving perfect scores in recall, accuracy, precision, and F1-score. Traditional machine learning models such as XGBoost show strong results, offering a balance between computational efficiency and accuracy. The choice of model depends on specific application requirements, including real-time constraints and available resources. The preliminary experimental results indicate that the proposed GPSCRS can effectively detect changes in stability under real-time conditions, highlighting its potential for use in remote health monitoring, fall prevention, and rehabilitation scenarios. By providing a quantitative measure of stability, the system enables early risk identification and supports tailored interventions for improved mobility and quality of life.
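The abstract describes deriving a single Physical Stability Index (PSI) from the timing of repeated "up"/"down" gestures, with a placeholder speed score. A toy sketch of such an index, where the weighting, the consistency measure, and the formula itself are all illustrative assumptions rather than the paper's actual definition:

```python
from statistics import mean, pstdev

def stability_index(intervals, speed_score=1.0):
    """Toy Physical Stability Index from the time gaps (seconds)
    between successive "up"/"down" gestures.

    consistency: regular gesture timing (low spread) scores near 1.
    speed_score: placeholder in [0, 1], mirroring the paper's design.
    Weights (0.7 / 0.3) are illustrative assumptions.
    """
    if len(intervals) < 2:
        return 0.0  # not enough gestures to assess stability
    mu = mean(intervals)
    # Coefficient-of-variation-style penalty on timing spread.
    consistency = 1.0 / (1.0 + pstdev(intervals) / mu)
    return round(0.7 * consistency + 0.3 * speed_score, 3)

# Perfectly regular gestures every 0.5 s give the maximum index.
print(stability_index([0.5, 0.5, 0.5, 0.5]))  # 1.0
```

Irregular intervals (e.g. `[0.2, 0.9, 0.1, 1.2]`) lower the index, which is the behaviour a fall-risk screen would key on.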
Journal Article
Gestural repair in Mandarin conversation
2022
Ever since Charles Goodwin’s seminal works on gaze, there has been a long-standing interest in Conversation Analysis in the interrelationship between talk and bodily conduct in the accomplishment of social action. Recently, a small but emerging body of research has explored the ways in which embodied conduct figures in the organization and operations of repair. In this article, I take up a similar theme and investigate the interaction between talk and iconic gestures in same-turn self-initiated repair in Mandarin conversation. The phenomenon I examine concerns the use of what I call “gestural repair.” The analysis focuses on how such repair can intertwine with talk in multi-stage operations in the progressivity and resolution of repair. The data are drawn from 50 hours of naturally-occurring conversations collected in China. Some unique features of such gestural repair observed in the Mandarin data are also discussed.
Journal Article
Italian Sign Language from a Cognitive and Socio-Semiotic Perspective
by Di Renzo, Alessio; Fontana, Sabina; Volterra, Virginia
in Cognition and language, Cognitive grammar, Gesture Studies
2022
This volume reveals new insights into the faculty of language by proposing a new approach to the analysis and description of Italian Sign Language (LIS) that can also be extended to other sign languages.
Real-Time Human Detection and Gesture Recognition for On-Board UAV Rescue
2021
Unmanned aerial vehicles (UAVs) play an important role in numerous technical and scientific fields, especially in wilderness rescue. This paper carries out work on real-time UAV human detection and recognition of body and hand rescue gestures. We use body-feature-based solutions, such as YOLOv3-tiny for human detection, to establish biometric communication. When the presence of a person is detected, the system enters the gesture recognition phase, where the user and the drone can communicate briefly and effectively, avoiding the drawbacks of speech communication. A dataset of ten body rescue gestures (i.e., Kick, Punch, Squat, Stand, Attention, Cancel, Walk, Sit, Direction, and PhoneCall) has been created with a UAV on-board camera. The two most important gestures are the novel dynamic Attention and Cancel, which represent the set and reset functions respectively. When the body rescue gesture is recognized as Attention, the drone gradually approaches the user at a larger resolution for hand gesture recognition. Using the deep learning method, the system achieves 99.80% accuracy on the body gesture test set and 94.71% accuracy on the hand gesture test set. Experiments conducted on real-time UAV cameras confirm that our solution achieves the intended UAV rescue purpose.
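The Attention/Cancel "set and reset" behaviour described above amounts to a small two-phase state machine: search for a person, then read hand gestures until reset. A minimal sketch of that control flow, where the event names and loop structure are illustrative assumptions based on the abstract:

```python
from enum import Enum, auto

class Phase(Enum):
    SEARCH = auto()   # looking for a person (body-gesture detection)
    GESTURE = auto()  # person armed the system: read hand gestures

def run(events):
    """Process a stream of recognized gesture labels.
    'Attention' arms hand-gesture reading; 'Cancel' resets to search."""
    phase = Phase.SEARCH
    log = []
    for ev in events:
        if phase is Phase.SEARCH and ev == "Attention":
            phase = Phase.GESTURE
        elif phase is Phase.GESTURE and ev == "Cancel":
            phase = Phase.SEARCH
        log.append((ev, phase.name))
    return log

print(run(["Walk", "Attention", "Direction", "Cancel"]))
```

Only gestures seen while in the GESTURE phase (here, "Direction") would be treated as commands to the drone.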
Journal Article