51 result(s) for "Haptic size perception"
‘See what you feel’: The impact of visual scale distance in haptic-to-visual crossmodal matching
Two experiments were conducted to explore the impact of the distance of a visual scale employed in the crossmodal matching method dubbed See What You Feel (SWYF), used to study the Uznadze haptic aftereffect. Previous studies reported that SWYF leads to a general underestimation of out-of-sight handheld spheres, which seems to increase with visual scale distance. Experiment 1 tested the effect of visual scale distance in haptic-to-visual crossmodal matching. A 19-step visual scale, made of actual 3D spheres (diameters ranging from 2.0 to 5.6 cm), was set at one of three possible distances (30, 160, 290 cm); participants’ task was to find the matching visual spheres for four out-of-sight handheld test spheres (diameters 3.0, 3.8, 4.6, 5.0 cm). Results confirmed the underestimation effect and only partially confirmed the role of scale distance. Experiment 2 investigated the role of scale distance in a visual-to-visual matching task in which the same visual scale was employed, set at one of three distances (37, 160, 290 cm). Participants’ task was to find a match for the same four test stimuli. Results showed no statistical difference between matched and actual sphere sizes at the 37 cm distance; underestimations were observed at the two farther distances, thus reflecting overestimations of scale sphere sizes. Results from both experiments allow us to conclude that the underestimation effect observed with SWYF is a general feature of haptic-to-visual crossmodal matching, and that the SWYF method is a valuable tool for measuring haptic size perception with handheld stimuli when the visual scale is set at a visually comfortable peripersonal distance.
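The 19-step scale in the abstract above implies a uniform 0.2 cm increment between adjacent spheres, since (5.6 − 2.0)/18 = 0.2, and all four test diameters fall on scale steps. A minimal sketch of that arithmetic (the function name is illustrative; the study's apparatus used actual 3D spheres, not a computed list):

```python
def visual_scale(min_d=2.0, max_d=5.6, steps=19):
    """Evenly spaced sphere diameters (cm): one 0.2 cm increment per step."""
    inc = (max_d - min_d) / (steps - 1)  # (5.6 - 2.0) / 18 = 0.2 cm
    return [round(min_d + i * inc, 1) for i in range(steps)]

scale = visual_scale()
test_spheres = [3.0, 3.8, 4.6, 5.0]  # each matches one of the 19 steps
```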
Haptic size perception is influenced by body and object orientation
Changes in body orientation from standing have been shown to impact our perception of visual size. This has been attributed to the vestibular system’s involvement in constructing a representation of the space around our body. In the current study we investigated how body posture influences haptic size perception. Blindfolded participants were tasked with estimating the felt length of a rod and then adjusting it back to its previously felt size (after it had been set to a random length). Participants could feel and adjust the rod in the same posture, standing or supine, or after a change in posture. If body orientation relative to gravity impacts size perception, we might expect changes in haptic size perception following body tilt. In support of this hypothesis, after changing between standing and supine postures there was a change in the rod’s haptically perceived length, but only when the orientation of the rod itself also changed with respect to gravity, not when its orientation was constant. This suggests that body posture influences not only visual but also haptic size perception, potentially due to the vestibular system contributing to the encoding of space with respect to gravity.
Encoding contact size using static and dynamic electrotactile finger stimulation: natural decoding vs. trained cues
Electrotactile stimulation through matrix electrodes is a promising technology to restore high-resolution tactile feedback in extended reality applications. One of the fundamental tactile effects that should be simulated is the change in the size of the contact between the finger and a virtual object. The present study investigated how participants perceive an increase in stimulation area when stimulating the index finger using static or dynamic (moving) stimuli produced by activating 1 to 6 electrode pads. To assess the ability to interpret the stimulation from the natural cues (natural decoding), without any prior training, the participants were instructed to draw the size of the stimulated area and identify the size difference when comparing two consecutive stimulations. To investigate whether other “non-natural” cues can improve the size estimation, the participants were asked to enumerate the number of active pads following a training protocol. The results demonstrated that participants could perceive the change in size without prior training (e.g., the estimated area correlated with the stimulated area, p < 0.001; a difference of two or more pads was recognized with a > 80% success rate). However, natural decoding was also challenging, as the response area changed gradually and sometimes in complex patterns when increasing the number of active pads (e.g., four extra pads were needed for a statistically significant difference). Nevertheless, by training the participants to utilize additional cues, the limitations of natural perception could be compensated for. After the training, the mismatch between the activated and estimated number of pads was less than one pad regardless of the stimulus size. Finally, introducing movement of the stimulus substantially improved discrimination (e.g., a 100% median success rate in recognizing differences of one pad or more).
The present study, therefore, provides insights into stimulation size perception, and practical guidelines on how to modulate pad activation to change the perceived size in static and dynamic scenarios.
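The encoding idea described above, signaling contact size by activating 1 to 6 electrode pads, amounts to quantizing a contact area into a pad count. A minimal sketch of one such mapping; the area bound and the linear quantization are assumptions for illustration, not the study's actual electrode geometry or calibration:

```python
import math

def pads_for_area(area_mm2, max_area_mm2=120.0, n_pads=6):
    """Quantize a fingertip contact area into a count of active pads (1..6).

    Illustrative mapping only: real devices calibrate this per electrode
    matrix; max_area_mm2 here is an assumed saturation bound.
    """
    if area_mm2 <= 0:
        return 0  # no contact, no stimulation
    frac = min(area_mm2 / max_area_mm2, 1.0)
    return max(1, math.ceil(frac * n_pads))  # any contact drives >= 1 pad
```

A dynamic (moving) stimulus would then cycle which physical pads realize this count over time, rather than change the count itself.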
Turning perception on its head: cephalic perception of whole and partial length of a wielded object
Flexibility is a fundamental hallmark of perceptual systems. In particular, there is a great deal of flexibility in the ability to perceive properties of occluded objects by effortful or dynamic touch—hefting, wielding, or otherwise manipulating those objects by muscular effort. Perception of length of an occluded wielded object is comparable when that object is wielded by anatomical components that differ in sensitivity, dexterity, and functionality. Moreover, perception of this property is supported by an analogous sensitivity to inertial properties across such components. We investigated the ability to perceive whole and partial length of an object wielded by hand or by head. Experiment 1 found that perception of length by these anatomical components is qualitatively and quantitatively indistinguishable. Experiment 2 found that perception of length is supported by the same specific sensitivity to inertial properties in each case. Experiment 3 found that perception of whole length and partial length are each supported by specific sensitivities to inertial properties, and that this is the case both by hand and by head. The results are discussed in the context of the nature of the stimulation patterns and the organization of the haptic system that are likely to support such flexibility in perception.
See What You Feel: A Crossmodal Tool for Measuring Haptic Size Illusions
The purpose of this research is to present a simple-to-use crossmodal method for measuring haptic size illusions. The method, which we call See What You Feel, was tested using Uznadze’s classic haptic aftereffect, in which two physically identical spheres (test spheres) appear different in size after the hands holding them have undergone an adaptation session with two other spheres (adapting spheres), one bigger and the other smaller than the test spheres. To measure the magnitude of the illusion, a three-dimensional visual scale was created, and participants were asked to find on it the spheres that corresponded in size to the spheres they were holding in their hands out of sight. The method, tested on 160 right-handed participants, is robust and easily understood by participants.
Exploring virtual reality object perception following sensory-motor interactions with different visuo-haptic collider properties
Interacting with the environment often requires the integration of visual and haptic information. Notably, perceiving external objects depends on how our brain binds sensory inputs into a unitary experience. The feedback provided by objects when we interact with them (through our movements) might then influence our perception. In VR, the interaction with an object can be dissociated from the size of the object itself by means of ‘colliders’ (interactive spaces surrounding the objects). The present study investigates possible after-effects in size discrimination for virtual objects after exposure to a prolonged interaction characterized by visual and haptic incongruencies. A total of 96 participants took part in this virtual reality study. Participants were distributed into four groups, in which they were required to perform a size discrimination task between two cubes before and after 15 min of a visuomotor task involving interaction with the same virtual cubes. Each group interacted with a different cube whose visual (normal vs. small collider) and haptic (vibration vs. no vibration) features were manipulated. The quality of interaction (number of touches and trials performed) was used as a dependent variable to investigate performance in the visuomotor task. To measure bias in size perception, we compared changes in the point of subjective equality (PSE) before and after the task in the four groups. The results showed that a small visual collider decreased manipulation performance, regardless of the presence or absence of the haptic signal. However, a change in PSE was found only in the group exposed to the small visual collider with haptic feedback, leading to an increased perception of the cube size. This after-effect was absent in the visual-only incongruency condition, suggesting that haptic information and multisensory integration played a crucial role in inducing perceptual changes.
The results are discussed considering the recent findings in visual-haptic integration during multisensory information processing in real and virtual environments.
Fractal fluctuations in muscular activity contribute to judgments of length but not heaviness via dynamic touch
The applied muscular effort to wield, hold, or balance an object shapes the medium by which action-relevant perceptual judgments (e.g., heaviness, length, width, and shape) are derived. Strikingly, the integrity of these judgments is retained over a range of exploratory conditions, a phenomenon known as perceptual invariance. For instance, judgments of length do not vary with the speed of rotation, despite the greater muscular effort required to wield objects at higher speeds. If not the amount of muscular effort alone, then what features of the neuromuscular activity implicated while wielding objects contribute to perception via dynamic touch? In the present study, we investigated how muscular activity mediates perception of heaviness and length of objects via dynamic touch. We measured EMG activity in biceps brachii and flexor carpi radialis as participants wielded objects of different moments of inertia. We found that variation in the amount of muscular effort (specifically, root-mean-square values of EMG activity) predicted variations in judgments of heaviness but not length. In contrast, fluctuations in the activity of biceps brachii and flexor carpi radialis were fractal, and variation in the degree of fractality in the two muscles predicted variation in judgments of length. These findings reflect the distinct implications of dynamic touch for perception of heaviness and length. Perceptions of length can be derived from minimal effort, and muscular effort only shapes the medium from which judgments of length are derived. We discuss our findings in the context of the body as a multifractal tensegrity system, wherein perceptual judgments of length by wielding implicate, at least in part, rapidly diffusing mechanotransduction perturbations cascading across the whole body.
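The two EMG summaries contrasted above—overall effort (root-mean-square amplitude) and fractality of fluctuations—are standard signal measures. A stdlib-only sketch of both, using detrended fluctuation analysis (DFA) as one common estimator of a fractal scaling exponent (the study's actual analysis pipeline may differ):

```python
import math
import random

def rms(signal):
    """Root-mean-square amplitude: the overall-effort summary."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def dfa_exponent(signal, window_sizes=(16, 32, 64, 128, 256)):
    """DFA scaling exponent: slope of log F(n) vs. log n.

    Roughly 0.5 for white noise, 1.5 for a random walk; persistent
    (fractal) fluctuations lie above 0.5.
    """
    mean = sum(signal) / len(signal)
    profile, s = [], 0.0
    for x in signal:                      # integrated (cumulative) profile
        s += x - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in window_sizes:
        sq, count = 0.0, 0
        tm = (n - 1) / 2.0
        var = sum((t - tm) ** 2 for t in range(n))
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            ym = sum(seg) / n
            cov = sum((t - tm) * (y - ym) for t, y in enumerate(seg))
            b = cov / var                 # least-squares linear detrend
            a = ym - b * tm
            sq += sum((y - (a + b * t)) ** 2 for t, y in enumerate(seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sq / count)))
    xm, ym = sum(log_n) / len(log_n), sum(log_f) / len(log_f)
    return (sum((x - xm) * (y - ym) for x, y in zip(log_n, log_f))
            / sum((x - xm) ** 2 for x in log_n))

# Demo on synthetic signals (not real EMG):
random.seed(0)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
walk, s = [], 0.0
for x in white:
    s += x
    walk.append(s)
alpha_white = dfa_exponent(white)  # expected near 0.5
alpha_walk = dfa_exponent(walk)    # expected near 1.5
```

Note the dissociation the abstract describes: two signals can share the same RMS (effort) while differing sharply in their DFA exponent (fractality).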
Investigating canonical size phenomenon in drawing from memory task in different perceptual conditions among children
The canonical size phenomenon refers to the mental representation of real-object size information: the objects larger in the physical world are represented as larger in mental spatial representations. This study tested this phenomenon in a drawing-from-memory task among children aged 5, 7, and 9 years. The participants were asked to draw objects whose actual sizes varied at eight size rank levels. Drawings were made on regular paper sheets or special foils to produce embossed drawings. When drawing from memory, the participants were either sighted or blindfolded (to prevent visual feedback). We predicted that the drawn size of objects would increase with increasing size rank of objects. The findings supported the hypothesis concerning the canonical size effect among all age groups tested. This means that children aged 5 to 9 represent real-world size information about everyday objects and are sensitive to their size subtleties. Moreover, the drawn size increased with increasing size ranks both within sighted and blindfolded perceptual conditions (however, the slope of functions that best explain the relation between size rank and drawn size varied between the perceptual conditions). This finding further supports the recent evidence of the spatial character of the canonical size phenomenon.
Horizontal target size perturbations during grasping movements are described by subsequent size perception and saccade amplitude
Perception and action are essential in our day-to-day interactions with the environment. Despite the dual-stream theory of action and perception, it is now accepted that action and perception processes interact with each other. However, little is known about the impact of unpredicted changes of target size during grasping actions on perception. We assessed whether size perception and saccade amplitude were affected before and after grasping a target that changed its horizontal size during action execution, in the presence or absence of tactile feedback. We tested twenty-one participants in 4 blocks of 30 trials. Blocks were divided into two experimental tactile feedback paradigms: tactile and non-tactile. Trials consisted of 3 sequential phases: pre-grasping size perception, grasping, and post-grasping size perception. During the pre- and post-grasping phases, participants executed a saccade towards a horizontal bar and performed a manual estimation of the bar size. During the grasping phase, participants were asked to execute a saccade towards the bar and to make a grasping action towards the screen. While grasping, 3 horizontal size perturbation conditions were applied: non-perturbation, shortening, and lengthening. Perturbations occurred in 30% of the trials, in which the bar was symmetrically shortened or lengthened by 33% of its original size. Participants’ hand and eye positions were assessed by a motion capture system and a mobile eye-tracker, respectively. After grasping, in both the tactile and non-tactile feedback paradigms, size estimation was significantly reduced in the lengthening (p = 0.002) and non-perturbation (p < 0.001) conditions, whereas shortening did not induce significant adjustments (p = 0.86). After grasping, saccade amplitude became significantly longer in the shortening condition (p < 0.001) and significantly shorter in the lengthening condition (p < 0.001). The non-perturbation condition did not display adjustments (p = 0.95).
Tactile feedback did not generate changes in the collected perceptual responses, but horizontal size perturbations did so, suggesting that all relevant target information used in the movement can be extracted from the post-action target perception.
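A symmetric 33% perturbation of the kind described above changes the bar's length about its midpoint, so each endpoint moves by half the total change and the bar's center stays fixed. A small sketch of that geometry (coordinate conventions and function name are assumptions for illustration):

```python
def perturb_bar(left_x, right_x, mode, fraction=0.33):
    """Return new bar endpoints after a symmetric size perturbation.

    mode: 'none', 'shorten', or 'lengthen'. The change is split equally
    between the two endpoints, so the bar's midpoint is unchanged.
    """
    length = right_x - left_x
    if mode == 'none':
        delta = 0.0
    elif mode == 'shorten':
        delta = -fraction * length
    elif mode == 'lengthen':
        delta = fraction * length
    else:
        raise ValueError(mode)
    half = delta / 2.0
    return left_x - half, right_x + half

# e.g. a 12-unit bar centred on 0: shortening yields length 12 * 0.67
short_l, short_r = perturb_bar(-6.0, 6.0, 'shorten')
```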
Selective perception in probing by foot: Perceiving the length of a probe and the distance of a probed surface
Perception of properties of an object wielded by means of muscular effort exhibits both task specificity and anatomical independence. A person can perceive different properties of an object wielded by a given anatomical component and can perceive a given property of an object wielded by different anatomical components. Task specificity and anatomical independence are fundamental characteristics of the haptic system described as a biotensegrity system embedded in lawfully structured energy arrays. We investigated whether both characteristics are also exhibited when a person attempts to perceive properties by means of a wielded object. Participants used a foot-wielded rod to probe a surface and reported the length of the rod and the distance of the probed surface (on separate sets of trials). The ability to differentiate these properties generalized across anatomical components, and perception of each property by foot was supported by sensitivities to the same invariant mechanical parameters that support perception of each property by hand. The results suggest that the biotensegrity hypothesis applies to perception both of and by means of an object attached to the body.