Catalogue Search | MBRL
2,074 result(s) for "Robot hands."
Robotic tactile perception and understanding : a sparse coding method
This book introduces the challenges of robotic tactile perception and task understanding, and describes an advanced approach based on machine learning and sparse coding techniques. Further, a set of structured sparse coding models is developed to address the issues of dynamic tactile sensing. The book then proves that the proposed framework is effective in solving the problems of multi-finger tactile object recognition, multi-label tactile adjective recognition and multi-category material analysis, which are all challenging practical problems in the fields of robotics and automation. The proposed sparse coding model can be used to tackle the challenging visual-tactile fusion recognition problem, and the book develops a series of efficient optimization algorithms to implement the model. It is suitable as a reference book for graduate students with a basic knowledge of machine learning as well as professional researchers interested in robotic tactile perception and understanding, and machine learning.
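The structured sparse-coding models the book describes all build on l1-regularized reconstruction of a signal against a dictionary. As a generic illustration of that core step (not the book's own algorithm), a minimal ISTA solver on a toy tactile-like signal might look like:

```python
import numpy as np

def ista(D, x, lam=0.1, step=None, iters=200):
    """Solve min_a 0.5*||x - D a||^2 + lam*||a||_1 by iterative
    soft-thresholding (ISTA), the generic workhorse behind sparse coding."""
    if step is None:
        step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1/L with L = ||D||_2^2
    a = np.zeros(D.shape[1])
    for _ in range(iters):
        g = a - step * D.T @ (D @ a - x)          # gradient step on the quadratic
        a = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-threshold
    return a

# toy signal coded against a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
a_true = np.zeros(50)
a_true[[3, 17, 40]] = [1.5, -2.0, 1.0]           # sparse ground truth
x = D @ a_true
a_hat = ista(D, x, lam=0.05)
```

The recovered code `a_hat` is sparse and reconstructs `x` up to the shrinkage bias introduced by the l1 penalty.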
Grasp quality measures: review and performance
2015
The correct grasp of objects is key to the successful execution of a given task. Obtaining a good grasp requires algorithms to automatically determine proper contact points on the object as well as proper hand configurations, especially when dexterous manipulation is desired, and quantifying how good a grasp is requires suitable grasp quality measures. This article reviews the quality measures proposed in the literature to evaluate grasp quality. The measures are classified into two groups according to the main aspect they evaluate: the location of contact points on the object and the hand configuration. Approaches that combine measures from the two groups into a global quality measure are also reviewed, as well as some measures related to human-hand studies and grasp performance. Several examples are presented to illustrate and compare the performance of the reviewed measures.
Journal Article
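Among the contact-point measures such reviews cover, the smallest singular value of the grasp matrix is one of the most common: it quantifies how weakly the grasp resists its worst-case wrench direction. A minimal planar sketch (frictionless point contacts assumed; this is an illustration, not code from the article):

```python
import numpy as np

def grasp_matrix(points, normals):
    """Planar grasp matrix: column i maps a unit contact force along
    normals[i] at points[i] to the object wrench (fx, fy, tau_z)."""
    cols = []
    for p, n in zip(points, normals):
        n = np.asarray(n, float) / np.linalg.norm(n)
        tau = p[0] * n[1] - p[1] * n[0]           # 2-D cross product p x n
        cols.append([n[0], n[1], tau])
    return np.array(cols).T

def sigma_min_quality(G):
    """Smallest singular value of G (larger is better)."""
    return np.linalg.svd(G, compute_uv=False)[-1]

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

# three symmetric contacts on a unit disc, normals pointing at the centre:
# all contact torques vanish, so this grasp cannot resist torque (quality 0)
pts = [np.array([np.cos(t), np.sin(t)]) for t in (0, 2 * np.pi / 3, 4 * np.pi / 3)]
Q_centered = sigma_min_quality(grasp_matrix(pts, [-p for p in pts]))

# tilting the normals lets the contacts generate torque, so quality rises
Q_tilted = sigma_min_quality(grasp_matrix(pts, [rot(0.3) @ (-p) for p in pts]))
```

The contrast between the two cases shows why contact-point location alone can make or break a grasp under this measure.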
The mechanics of robot grasping
"In this comprehensive textbook about robot grasping, you will find an integrated look at the major concepts and technical results in robot grasp mechanics. A large body of prior research, including key theories, graphical techniques, and insights on robot hand designs, is organized into a systematic review, using common notation and a common analytical framework. With introductory and advanced chapters that support senior undergraduate and graduate level robotics courses, this book provides a full introduction to robot grasping principles that are needed to model and analyze multi-finger robot grasps, and serves as a valuable reference for robotics students, researchers, and practicing robot engineers. Each chapter contains many worked-out examples, exercises with full solutions, and figures that highlight new concepts and help the reader master the use of the theories and equations presented"-- Provided by publisher.
Fundamentals of robotic grasping and fixturing
by Xiong, Youlun; Ding, Han; Xiong, Caihua
in Design and construction; Industrial and Systems Engineering; Mathematical models
2007
This book provides fundamental knowledge of robotic grasping and fixturing (RGF) manipulation. To help RGF manipulation become a science rather than an art, the book is designed to give a thorough understanding of RGF, from multifingered robot-hand grasping and basic fixture-design principles to the evaluation and planning of robotic grasping and fixturing, and it focuses on the modeling and applications of RGF.
Universally Grasping Objects with Granular-Tendon Finger: Principle and Design
2023
Nowadays, achieving the stable grasping of objects in robotics requires an increased emphasis on soft interactions. This research introduces a novel gripper design to achieve a more universal object grasping. The key feature of this gripper design was a hybrid mechanism that leveraged the soft structure provided by multiple granular pouches attached to the finger skeletons. To evaluate the performance of the gripper, a series of experiments were conducted using fifteen distinct types of objects, including cylinders, U-shaped brackets, M3 bolts, tape, pyramids, big pyramids, oranges, cakes, coffee sachets, spheres, drink sachets, shelves, pulley gears, aluminium profiles, and flat brackets. Our experimental results demonstrated that our gripper design achieved high success rates in gripping objects weighing less than 210 g. One notable advantage of the granular-tendon gripper was its ability to generate soft interactions during the grasping process while having a skeleton support to provide strength. This characteristic enabled the gripper to adapt effectively to various objects, regardless of their shape and material properties. Consequently, this work presented a promising solution for manipulating a wide range of objects with both stability and soft interaction capabilities, regardless of their individual characteristics.
Journal Article
Methods for Simultaneous Robot-World-Hand–Eye Calibration: A Comparative Study
2019
In this paper, we propose two novel methods for robot-world-hand–eye calibration and provide a comparative analysis against six state-of-the-art methods. We examine the calibration problem from two alternative geometrical interpretations, called 'hand–eye' and 'robot-world-hand–eye', respectively. The study analyses the effects of specifying the objective function as a pose-error or a reprojection-error minimization problem. We provide three real and three simulated datasets with rendered images as part of the study. In addition, we propose a robotic-arm error modeling approach to be used with the simulated datasets to generate realistic responses. Tests on simulated data are performed both in ideal cases and with pseudo-realistic robotic-arm pose and visual noise. Our methods show significant improvement and robustness on many metrics in various scenarios compared to the state-of-the-art methods.
Journal Article
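The robot-world-hand-eye problem is classically written AX = ZB, where A is the base-to-hand pose from robot kinematics, B the camera-to-target pose from vision, and X, Z the unknown hand-eye and robot-world transforms. A small numerical sanity check of that constraint (conventions vary between papers; this is an illustration, not the proposed solver):

```python
import numpy as np

def rand_se3(rng):
    """Random rigid transform as a 4x4 homogeneous matrix."""
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1                 # force a proper rotation (det = +1)
    T = np.eye(4)
    T[:3, :3] = Q
    T[:3, 3] = rng.standard_normal(3)
    return T

rng = np.random.default_rng(1)
X = rand_se3(rng)                     # hand -> eye (unknown in calibration)
Z = rand_se3(rng)                     # base -> world/target (unknown)

residuals = []
for _ in range(5):
    A = rand_se3(rng)                 # base -> hand from robot kinematics
    # B constructed to be consistent with A, X, Z, as a noise-free
    # measurement would be: AX = ZB  =>  B = X^-1 A^-1 Z
    B = np.linalg.inv(X) @ np.linalg.inv(A) @ Z
    residuals.append(np.linalg.norm(A @ X @ B - Z))
```

Each pose pair (A, B) must satisfy this relation exactly in the noise-free case; real calibration methods minimize exactly this kind of residual (or its reprojection counterpart) over many pairs.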
Human Preferences for Robot Eye Gaze in Human-to-Robot Handovers
2022
This paper investigates humans' preferences for a robot's eye gaze behavior during human-to-robot handovers. We studied gaze patterns for all three phases of the handover process: reach, transfer, and retreat, as opposed to previous work, which only focused on the reaching phase. Additionally, we investigated whether the object's size or fragility or the human's posture affects the human's preferences for the robot gaze. A public dataset of human-human handovers was analyzed to obtain the most frequent gaze behaviors that human receivers perform. These were then used to program the robot's receiver gaze behaviors. In two sets of user studies (video and in-person), a collaborative robot exhibited these gaze behaviors while receiving an object from a human. In the video studies, 72 participants watched and compared videos of handovers between a human actor and a robot demonstrating each of the three gaze behaviors. In the in-person studies, a different set of 72 participants physically performed object handovers with the robot and evaluated their perception of the handovers for the robot's different gaze behaviors. Results showed that, for both observers and participants in a handover, when the robot exhibited Face-Hand-Face gaze (gazing at the giver's face, then at the giver's hand during the reach phase, and back at the giver's face during the retreat phase), participants considered the handover to be more likable, anthropomorphic, and communicative of timing (p < 0.0001). However, we did not find evidence of any effect of the object's size or fragility or the giver's posture on the gaze preference.
Journal Article
A Bibliometric Review of Brain–Computer Interfaces in Motor Imagery and Steady-State Visually Evoked Potentials for Applications in Rehabilitation and Robotics
by Quiles-Cucarella, Eduardo; Chio, Nayibe
in Bibliometrics; Brain research; Brain-Computer Interfaces
2024
In this paper, a bibliometric review is conducted on brain–computer interfaces (BCI) in non-invasive paradigms like motor imagery (MI) and steady-state visually evoked potentials (SSVEP) for applications in rehabilitation and robotics. An exploratory and descriptive approach is used in the analysis. Computational tools such as the biblioshiny application for R-Bibliometrix and VOSViewer are employed to generate data on years, sources, authors, affiliation, country, documents, co-author, co-citation, and co-occurrence. This article allows for the identification of different bibliometric indicators such as the research process, evolution, visibility, volume, influence, impact, and production in the field of brain–computer interfaces for MI and SSVEP paradigms in rehabilitation and robotics applications from 2000 to August 2024.
Journal Article
An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand
2019
Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural, multi-task real-time control of a soft robot. In this study, a novel multimodal human-machine interface (mHMI) system is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate numerous control instructions. Moreover, we also explore subject acceptance of an affordable wearable soft robot for basic hand actions during robot-assisted movement. Six healthy subjects separately perform left- and right-hand motor imagery, looking-left and looking-right eye movements, and different hand gestures in different modes to control a soft robot in a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than achievable with any individual mode. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, equivalent to a control speed of 17 actions per minute. The study is expected to lead to a more user-friendly mHMI for real-time control of soft robots that helps healthy or disabled persons perform basic hand movements in a friendly and convenient way.
Journal Article
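The reported accuracy (93.83%), speed (17 actions/min), and information transfer rate (47.41 bits/min) are consistent with the standard Wolpaw ITR formula if one assumes a command set of roughly ten classes; the abstract does not state the exact count, so the class number below is an assumption:

```python
from math import log2

def wolpaw_itr(p, n, selections_per_min):
    """Wolpaw information transfer rate in bits/min for an n-class
    interface with classification accuracy p (0 < p < 1)."""
    bits_per_selection = (log2(n)
                          + p * log2(p)
                          + (1 - p) * log2((1 - p) / (n - 1)))
    return bits_per_selection * selections_per_min

itr = wolpaw_itr(0.9383, 10, 17)   # n=10 commands is an assumed value
```

With these inputs the formula gives roughly 47 bits/min, matching the paper's reported figure to within rounding.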
Design and Development of an Adaptive Robotic Gripper
by Deb, Alok Kanti; Ansary, Sainul Islam; Deb, Sankha
in 3-D printers; Actuation; Anthropomorphism
2023
In this paper, the design and development of an adaptive gripper are presented. Adaptive grippers are useful for grasping objects of varied geometric shapes by wrapping their fingers around the object. The finger closing sequence in adaptive grippers may eject the object from the gripper if the grasping force is unbalanced, and such grasp failure is common for lightweight objects. The design of the proposed gripper focuses on ensuring a stable grasp on a wide variety of objects, especially lightweight ones (e.g., empty plastic bottles). The proposed actuation mechanism is based on movable pulleys and tendon wires, which ensure that once a link stops moving, the other links continue to move and wrap around the object. Further, optimisation is used to improve the design of the adaptive gripper, and the optimised gripper has been fabricated using 3D printing. Finally, the design is validated by grasping common household objects with an industrial robot fitted with the developed gripper.
Journal Article