Catalogue Search | MBRL
Explore the vast range of titles available.
909 results for "Multisensory integration"
Neural correlates of multisensory integration in the human brain: an ALE meta-analysis
by Lampert, Angelika; Scheliga, Sebastian; Rolke, Roman
in ALE meta-analysis; Brain - physiology; Brain mapping
2023
Previous fMRI research identified the superior temporal sulcus as a central integration area for audiovisual stimuli. However, less is known about a general multisensory integration network across the senses. We therefore conducted an activation likelihood estimation (ALE) meta-analysis across multiple sensory modalities to identify a common brain network. We included 49 studies covering all Aristotelian senses, i.e., auditory, visual, tactile, gustatory, and olfactory stimuli. The analysis revealed significant activation in the bilateral superior temporal gyrus, middle temporal gyrus, thalamus, right insula, and left inferior frontal gyrus. We assume these regions form a general multisensory integration network comprising different functional roles: the thalamus operates as a first subcortical relay projecting sensory information to higher cortical integration centers in the superior temporal gyrus/sulcus, while conflict-processing regions such as the insula and inferior frontal gyrus facilitate the integration of incongruent information. We additionally performed meta-analytic connectivity modelling and found that each brain region showed co-activations within the identified multisensory integration network. By including multiple sensory modalities in our meta-analysis, the results may therefore provide evidence for a common brain network that supports different functional roles for multisensory integration.
Journal Article
On the relation between body ownership and sense of agency: A link at the level of sensory-related signals
by Burin, Dalila; Pia, Lorenzo; Pyasik, Maria
in Cognition & reasoning; Correlation analysis; Proprioception
2018
The relation between the sense of body ownership and the sense of agency is still highly debated. Here we investigated, in a large sample of healthy participants, the associations between several implicit and explicit indexes of the two senses. Specifically, we examined the correlations between proprioceptive shift (implicit measure) and a questionnaire on the subjective experience of ownership (explicit measure) within the rubber hand illusion paradigm (body ownership), and between intentional binding (implicit measure), attenuation of the intensity of auditory outcomes of actions (implicit measure), and a questionnaire on the subjective experience of authorship (explicit measure) within Libet's clock paradigm (sense of agency). Our results showed that proprioceptive shift was positively correlated with the attenuation of auditory outcomes. No significant correlations were found between the explicit measures of the two senses. We argue that the individual spatiotemporal constraints subserving the integration of sensory-related signals (the implicit signature) would be common to both senses, whereas their subjective experience (the explicit signature) would rely on additional processes specific to each sense.
Journal Article
What Color is My Arm? Changes in Skin Color of an Embodied Virtual Arm Modulates Pain Threshold
2013
It has been demonstrated that visual inputs can modulate pain. However, the influence of skin color on pain perception is unknown. Red skin is associated with inflamed, hot, and more sensitive skin, while blue is associated with cyanotic, cold skin. We aimed to test whether the color of the skin would alter the heat pain threshold. To this end, we used an immersive virtual environment in which we induced embodiment of a virtual arm that was co-located with the real one and seen from a first-person perspective. Virtual reality allowed us to dynamically modify the skin color of the virtual arm. To test the pain threshold, increasing ramps of heat stimulation applied to the participants' arm were delivered concomitantly with the gradual intensification of different colors on the embodied avatar's arm. We found that a reddened arm significantly decreased the pain threshold compared with normal and bluish skin. This effect was specific to red seen on the arm; seeing red in a spot outside the arm did not decrease the pain threshold. These results demonstrate an influence of skin color on pain perception. This top-down modulation of pain through visual input suggests a potential use of embodied virtual bodies in pain therapy.
Journal Article
Effects of Visually Induced Self-Motion on Sound Localization Accuracy
2022
The deterioration of sound localization accuracy during a listener’s head/body rotation is independent of the listener’s rotation velocity. However, whether this deterioration occurs only during physical movement in a real environment remains unclear. In this study, we addressed this question by subjecting physically stationary listeners to visually induced self-motion, i.e., vection. Two conditions—one with a visually induced perception of self-motion (vection) and the other without vection (control)—were adopted. Under both conditions, a short noise burst (30 ms) was presented via a loudspeaker in a circular array placed horizontally in front of a listener. The listeners were asked to determine whether the acoustic stimulus was localized relative to their subjective midline. The results showed that in terms of detection thresholds based on the subjective midline, the sound localization accuracy was lower under the vection condition than under the control condition. This indicates that sound localization can be compromised under visually induced self-motion perception. These findings support the idea that self-motion information is crucial for auditory space perception and can potentially enable the design of dynamic binaural displays requiring fewer computational resources.
Journal Article
Effects of Audiovisual Memory Cues on Working Memory Recall
2021
Previous studies have focused on topics such as multimodal integration and object discrimination, but there is limited research on the effect of multimodal learning on memory. Perceptual studies have shown facilitative effects of multimodal stimuli on learning; the current study aims to determine whether this effect persists with memory cues. The purpose of this study was to investigate the effect of audiovisual memory cues on memory recall, and whether the use of multiple memory cues leads to higher recall. The goal was to orthogonally evaluate the effects of the number of self-generated memory cues (one or three) and the modality of the self-generated memory cue (visual: written words; auditory: spoken words; or audiovisual). A recall task was administered in which participants were presented with their self-generated memory cues and asked to determine the target word. There was a significant main effect of the number of cues, but no main effect of modality. A secondary goal of this study was to determine which types of memory cues result in the highest recall. Self-reference cues resulted in the highest accuracy score. This study has applications for improving academic performance through the most efficient learning techniques.
Journal Article
Top down influence on visuo-tactile interaction modulates neural oscillatory responses
2012
Multisensory integration involves bottom-up as well as top-down processes. We investigated the influence of top-down control on neural responses to multisensory stimulation using EEG recording and time-frequency analyses. Participants were stimulated at the index finger or thumb of the left hand, using tactile vibrators mounted on a foam cube. Simultaneously, they received a visual distractor from a light-emitting diode adjacent to the active vibrator (spatially congruent trial) or adjacent to the inactive vibrator (spatially incongruent trial). The task was to respond to the elevation of the tactile stimulus (upper or lower) while ignoring the simultaneous visual distractor. To manipulate top-down control over this multisensory stimulation, the proportion of spatially congruent (vs. incongruent) trials was varied across blocks. Our results reveal that the behavioral cost of responding to incongruent compared with congruent trials (i.e., the crossmodal congruency effect) was modulated by the proportion of congruent trials. Most importantly, the EEG gamma band response and gamma–theta coupling were also affected by this modulation of top-down control, whereas the late theta band response related to the congruency effect was not. These findings suggest that the gamma band response is more than a marker of multisensory binding, being also sensitive to the correspondence between expected and actual multisensory stimulation. By contrast, the theta band response was affected by congruency but appears to be largely immune to stimulation expectancy.
► We investigated top-down modulation of visuo-tactile integration using EEG.
► Gamma band oscillation in the parietal area plays a role in top-down modulation.
► Cross-frequency (gamma–theta) coupling is also related to top-down modulation.
Journal Article
Intentional Binding Without Intentional Action
2019
The experience of authorship over one's actions and their consequences, the sense of agency, is a fundamental aspect of conscious experience. In recent years, it has become common to use intentional binding as an implicit measure of the sense of agency. However, it remains contentious whether reported intentional-binding effects indicate a role for intention-related information in perception or merely represent a strong case of multisensory causal binding. Here, we used a novel virtual-reality setup to demonstrate binding effects of identical magnitude in both the presence and the complete absence of intentional action, when perceptual stimuli were matched for temporal and spatial information. Our results demonstrate that intentional-binding-like effects are most simply accounted for by multisensory causal binding, without necessarily being related to intention or agency. Future studies that relate binding effects to agency must provide evidence for effects beyond those expected from multisensory causal binding alone.
Journal Article
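As context for the intentional-binding abstract above: binding is commonly quantified as the compression of the judged action-outcome interval relative to a baseline condition. A minimal sketch of that comparison, with entirely hypothetical numbers (not data from the study):

```python
def mean(values):
    """Arithmetic mean of a sequence of numbers."""
    values = list(values)
    return sum(values) / len(values)

def binding_score(operant_judgments_ms, baseline_judgments_ms):
    """Binding = mean judged interval in the operant (event-pair)
    condition minus the baseline (outcome-alone) mean; more negative
    values indicate stronger compression of the perceived interval."""
    return mean(operant_judgments_ms) - mean(baseline_judgments_ms)

# Hypothetical judged intervals (ms) for the same 250 ms actual delay
operant = [180, 200, 190]    # paired events: interval feels shorter
baseline = [255, 260, 250]   # outcome alone
print(binding_score(operant, baseline))  # -65.0
```

The sign convention is arbitrary; what matters for the study's argument is that the same score can be computed whether or not the first event was an intentional action.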
Multisensory Integration as per Technological Advances: A Review
by Velasco, Carlos; Obrist, Marianna; Cornelio, Patricia
in Acoustics; Behavior; human–computer interaction
2021
Multisensory integration research has allowed us to better understand how humans integrate sensory information to produce a unitary experience of the external world. However, this field is often challenged by the limited ability to deliver and control sensory stimuli, especially when going beyond audio–visual events and outside laboratory settings. In this review, we examine the scope and challenges of new technology in the study of multisensory integration in a world that is increasingly characterized as a fusion of physical and digital/virtual events. We discuss multisensory integration research through the lens of novel multisensory technologies and, thus, bring research in human–computer interaction, experimental psychology, and neuroscience closer together. Today, for instance, displays have become volumetric so that visual content is no longer limited to 2D screens, new haptic devices enable tactile stimulation without physical contact, olfactory interfaces provide users with smells precisely synchronized with events in virtual environments, and novel gustatory interfaces enable taste perception through levitating stimuli. These technological advances offer new ways to control and deliver sensory stimulation for multisensory integration research beyond traditional laboratory settings and open up new experimentations in naturally occurring events in everyday life experiences. Our review then summarizes these multisensory technologies and discusses initial insights to introduce a bridge between the disciplines in order to advance the study of multisensory integration.
Journal Article
Multisensory interactions regulate feeding behavior in Drosophila
2021
The integration of two or more distinct sensory cues can help animals make more informed decisions about potential food sources, but little is known about how feeding-related multimodal sensory integration happens at the cellular and molecular levels. Here, we show that multimodal sensory integration contributes to a stereotyped feeding behavior in the model organism Drosophila melanogaster. Simultaneous olfactory and mechanosensory inputs significantly influence a taste-evoked feeding behavior called the proboscis extension reflex (PER). Olfactory and mechanical information are mediated by antennal Or35a neurons and leg hair plate mechanosensory neurons, respectively. We show that the controlled delivery of three different sensory cues can produce a supra-additive PER via the concurrent stimulation of olfactory, taste, and mechanosensory inputs. We suggest that the fruit fly is a versatile model system to study multisensory integration related to feeding, which also likely exists in vertebrates.
Journal Article
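The fly study above reports a supra-additive PER, i.e., the response to combined cues exceeds the sum of the responses to each cue alone. A minimal illustration of that criterion, with made-up response rates (assumed for the example, not taken from the paper):

```python
def is_supra_additive(multimodal_response, unimodal_responses):
    """True when the combined-cue response exceeds the sum of the
    responses to each cue presented alone (the supra-additive criterion)."""
    return multimodal_response > sum(unimodal_responses)

# Hypothetical proboscis extension reflex (PER) rates per condition
taste_only, odor_only, touch_only = 0.30, 0.05, 0.05
all_cues = 0.55
print(is_supra_additive(all_cues, [taste_only, odor_only, touch_only]))  # True
```

The same additive-baseline comparison underlies the "superadditive responses" mentioned in the newborn EEG study later in this list.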
Spatial tuning of electrophysiological responses to multisensory stimuli reveals a primitive coding of the body boundaries in newborns
by Garbarini, Francesca; Gazzin, Andrea; Perathoner, Cristina
in Biological Sciences; BRIEF REPORTS; Psychological and Cognitive Sciences
2021
The ability to identify our own body and its boundaries is crucial for survival. Ideally, the sooner we learn to discriminate external stimuli occurring close to our body from those occurring far from it, the better (and safer) we may interact with the sensory environment. However, when this mechanism emerges within ontogeny is unknown. Is it something acquired throughout infancy, or is it already present soon after birth? The presence of a spatial modulation of multisensory integration (MSI) is considered a hallmark of a functioning representation of the body position in space. Here, we investigated whether MSI is present and spatially organized in 18- to 92-h-old newborns. We compared electrophysiological responses to tactile stimulation when concurrent auditory events were delivered close to, as opposed to far from, the body in healthy newborns and in a control group of adult participants. In accordance with previous studies, adult controls showed a clear spatial modulation of MSI, with greater superadditive responses for multisensory stimuli close to the body. In newborns, we demonstrated the presence of a genuine electrophysiological pattern of MSI, with older newborns showing a larger MSI effect. Importantly, as for adults, multisensory superadditive responses were modulated by the proximity to the body. This finding may represent the electrophysiological mechanism responsible for a primitive coding of bodily self boundaries, thus suggesting that even just a few hours after birth, human newborns identify their own body as a distinct entity from the environment.
Journal Article