Asset Details
Analysis, Synthesis, and Perception of Multisensory Feedback in Touch
by Lu, Shihan
in Artificial intelligence / Computer Engineering / Computer science / Robotics
2024
Dissertation
Overview
Going beyond vision has been essential for many areas, from creating immersive experiences in virtual reality to equipping robots with human-like sensing capabilities. This need is particularly acute when touch interactions play a dominant role in a scenario, such as perceiving the roughness of a material through scratching or localizing contacts in a game of Jenga. Touch interaction is a multimodal exploration procedure (e.g., grasping, tapping, holding) with multisensory feedback (e.g., forces, sounds, vibrations) produced during the action. The multimodal exploration and multisensory feedback around touch interactions constitute an interactive loop, where each action influences subsequent perception, and perception in turn guides the next actions. The intricate interplay between action and perception in this loop forms the basis of our rich touch experiences, allowing us to gather detailed information about object properties, manipulate items with precision, and respond meaningfully to our environment. The interactive and multisensory nature of touch enables a rich array of feedback types triggered by various exploratory actions. Through careful design, these feedback mechanisms can even extend beyond natural human touch capabilities. The feedback produced during both human and artificial (robotic) touch interactions presents unique challenges and opportunities in terms of analysis, synthesis, perception, and applications. These challenges range from accurately capturing and interpreting complex tactile stimuli to synthesizing realistic touch sensations in virtual environments. In this thesis, I contribute to two main directions focused on the multimodal and multisensory experience of touch-based interactions, developing methods (1) to simulate and reconstruct touch-produced multisensory signals and (2) to analyze and decipher the multisensory information generated from active multimodal touch explorations.
For (1), I present a new preference-driven haptic texture modeling framework aimed at addressing the limitations of state-of-the-art data-driven frameworks, such as expensive recording devices, the expertise required for data collection, and the inadaptability of the resulting models. The proposed framework combines the power of generative models for automatic texture generation with humans' ability to discern texture details for interactive texture search via evolutionary strategies. It comprises an iterative process of continuously tuning and refining the modeled texture, guided by human preference. Furthermore, I explore another important sensory modality produced in touch interactions: auditory feedback, particularly texture sounds. I propose a data-driven texture sound modeling and rendering approach for unconstrained tool-surface interactions that exploits the hierarchical tree structure of the wavelet transform to decompose a recorded texture sound and then reconstruct new sounds with controllable uncertainty in each frequency component. The newly generated virtual texture sounds realistically match the user's motion in real time. For (2), inspired by the cross-talk among cortical regions during sensory processing in the human brain, I present a feature extraction method that uses the crossmodal congruence between auditory and vibrotactile feedback. I propose a crossmodal inter-band spectral mapping that relates the frequency bands of the two modalities to achieve robust texture signature extraction. The proposed feature is evaluated on a large-scale texture classification task, showing a significant improvement in classification accuracy with only a small amount of training data. Furthermore, focusing on the challenging problem of state-aware robotic manipulation, I design a new touch sensing method that uses objects' acoustic responses under excitation, termed active acoustic sensing. Important sensing capabilities, including object shape and material, grasping point, internal structure, and external contact with the environment, are validated in both simulated and physical experiments. Lastly, I integrate this active acoustic sensing method into a robotic learning-from-demonstration pipeline and demonstrate its superior performance on contact-rich manipulation tasks.
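The wavelet-based resynthesis idea above can be sketched in a few lines. This is an illustrative toy, not the thesis's actual pipeline: a simple Haar decomposition stands in for the wavelet transform used in the work, and every function name, parameter, and value here is hypothetical. Each detail (frequency) band is jittered by a controllable amount of noise before reconstruction, which is the "controllable uncertainty per frequency component" notion in miniature.

```python
import numpy as np

def haar_decompose(x, levels):
    """Split a signal into one coarse band plus `levels` detail bands."""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))  # high-frequency band
        approx = (even + odd) / np.sqrt(2.0)         # low-frequency band
    return approx, details

def haar_reconstruct(approx, details):
    """Exact inverse of haar_decompose."""
    for d in reversed(details):
        even = (approx + d) / np.sqrt(2.0)
        odd = (approx - d) / np.sqrt(2.0)
        merged = np.empty(even.size + odd.size)
        merged[0::2], merged[1::2] = even, odd
        approx = merged
    return approx

def resynthesize(signal, levels=4, uncertainty=0.1, seed=0):
    """Jitter each detail band by `uncertainty` times its own spread,
    then reconstruct a new variant of the signal."""
    rng = np.random.default_rng(seed)
    approx, details = haar_decompose(signal, levels)
    noisy = [d + rng.normal(0.0, uncertainty * d.std(), d.shape)
             for d in details]
    return haar_reconstruct(approx, noisy)

# Toy "recorded" texture sound: a decaying oscillation (length 2**10
# so that four halving levels divide evenly).
t = np.linspace(0.0, 1.0, 1024)
recorded = np.sin(2 * np.pi * 180 * t) * np.exp(-3 * t)
variant = resynthesize(recorded, uncertainty=0.2)
```

With `uncertainty=0.0` the round trip reproduces the input exactly, so the uncertainty knob alone controls how far each resynthesized variant strays from the recording in each band.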
Publisher
ProQuest Dissertations & Theses
Subject
Artificial intelligence / Computer Engineering / Computer science / Robotics
ISBN
9798314860069