Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
16 result(s) for "Keetels, Mirjam"
Perception of intersensory synchrony: A tutorial review
by Keetels, Mirjam; Vroomen, Jean
in Adaptation, Biological - physiology; Auditory Stimuli; Behavioral Science and Psychology
2010
For most multisensory events, observers perceive synchrony among the various senses (vision, audition, touch), despite the naturally occurring lags in arrival and processing times of the different information streams. A substantial amount of research has examined how the brain accomplishes this. In the present article, we review several key issues about intersensory timing and identify four mechanisms by which the brain might deal with intersensory lags: by ignoring lags up to some point (a wide window of temporal integration), by compensating for predictable variability, by adjusting the point of perceived synchrony over the longer term, and by shifting one stream directly toward the other.
Journal Article
Audio-motor but not visuo-motor temporal recalibration speeds up sensory processing
by Sugano, Yoshimori; Keetels, Mirjam; Vroomen, Jean
in Biology and Life Sciences; Engineering and Technology; Medicine and Health Sciences
2017
Perception of synchrony between one's own action (a finger tap) and the sensory feedback thereof (a visual flash or an auditory pip) can be recalibrated after exposure to an artificially inserted delay between them (temporal recalibration effect: TRE). TRE might be mediated by a compensatory shift of motor timing (when did I tap?) and/or the sensory timing of the feedback (when did I hear/see the feedback?). To examine this, we asked participants to voluntarily tap their index finger at a constant pace while receiving visual or auditory feedback (a flash or pip) that was either synced or somewhat delayed relative to the tap. Following this exposure phase, they then performed a simple reaction time (RT) task to measure the sensory timing of the exposure stimulus, and a sensorimotor synchronization (SMS) task (tapping in synchrony with a flash or pip as pacing stimulus) to measure the point of subjective synchrony between the tap and pacing stimulus. The results showed that after exposure to delayed auditory feedback, participants tapped earlier (~21.5 ms) relative to auditory pacing stimuli (= temporal recalibration) and reacted faster (~5.6 ms) to auditory stimuli. For visual exposure and test stimuli, there were no such compensatory effects. These results indicate that adjustments of audio-motor synchrony can to some extent be explained by a change in the speed of auditory sensory processing. We discuss this in terms of an attentional modulation of sensory processing.
Journal Article
Reading-induced shifts of perceptual speech representations in auditory cortex
by Vroomen, Jean; Formisano, Elia; Correia, Joao M.
in 59/36; 631/378/2649/1594; 631/378/2649/1723
2017
Learning to read requires the formation of efficient neural associations between written and spoken language. Whether these associations influence the auditory cortical representation of speech remains unknown. Here we address this question by combining multivariate functional MRI analysis and a newly-developed ‘text-based recalibration’ paradigm. In this paradigm, the pairing of visual text and ambiguous speech sounds shifts (i.e. recalibrates) the perceptual interpretation of the ambiguous sounds in subsequent auditory-only trials. We show that it is possible to retrieve the text-induced perceptual interpretation from fMRI activity patterns in the posterior superior temporal cortex. Furthermore, this auditory cortical region showed significant functional connectivity with the inferior parietal lobe (IPL) during the pairing of text with ambiguous speech. Our findings indicate that reading-related audiovisual mappings can adjust the auditory cortical representation of speech in typically reading adults. Additionally, they suggest the involvement of the IPL in audiovisual and/or higher-order perceptual processes leading to this adjustment. When applied in typical and dyslexic readers of different ages, our text-based recalibration paradigm may reveal relevant aspects of perceptual learning and plasticity during successful and failing reading development.
Journal Article
Auditory dominance in motor-sensory temporal recalibration
by Sugano, Yoshimori; Keetels, Mirjam; Vroomen, Jean
in Acoustic Stimulation; Adaptation; Analysis of Variance
2016
Perception of synchrony between one’s own action (e.g. a finger tap) and the sensory feedback thereof (e.g. a flash or click) can be shifted after exposure to an induced delay (temporal recalibration effect, TRE). It remains elusive, however, whether the same mechanism underlies motor-visual (MV) and motor-auditory (MA) TRE. We examined this by measuring crosstalk between MV- and MA-delayed feedbacks. During an exposure phase, participants pressed a mouse button at a constant pace while receiving visual or auditory feedback that was either delayed (+150 ms) or subjectively synchronous (+50 ms). During a post-test, participants tried to tap in sync with visual or auditory pacers. TRE manifested itself as a compensatory shift in the tap–pacer asynchrony (a larger anticipation error after exposure to delayed feedback). In Experiment 1, MA and MV feedback were either both synchronous (MV-sync and MA-sync) or both delayed (MV-delay and MA-delay), whereas in Experiment 2, different delays were mixed across alternating trials (MV-sync and MA-delay, or MV-delay and MA-sync). Exposure to consistent delays induced equally large TREs for auditory and visual pacers, with similar build-up courses. However, with mixed delays, we found that synchronized sounds erased MV-TRE, but synchronized flashes did not erase MA-TRE. These results suggest that similar mechanisms underlie MA- and MV-TRE, but that auditory feedback is more potent than visual feedback in inducing a rearrangement of motor-sensory timing.
Journal Article
Adaptation to motor-visual and motor-auditory temporal lags transfer across modalities
by Sugano, Yoshimori; Keetels, Mirjam; Vroomen, Jean
in Acoustic Stimulation; Action; Adaptation
2010
Previous research has shown that the timing of a sensorimotor event is recalibrated after brief exposure to delayed feedback of a voluntary action (Stetson et al. 2006). Here, we examined whether it is the sensory or the motor event that is shifted in time. We compared lag adaptation for action-feedback pairs in the visuo-motor and audio-motor domains using an adaptation-test paradigm. Participants were exposed to a constant lag (50 or 150 ms) between their voluntary action (a finger tap) and its sensory feedback (a flash or tone pip) during an adaptation period (~3 min). Immediately afterwards, they performed a temporal order judgment (TOJ) task on the tap-feedback test stimulus pairings. The modality of the feedback stimulus was either the same as the adapted one (within-modal) or different (cross-modal). The results showed that the point of subjective simultaneity (PSS) was uniformly shifted in the direction of the exposed lag, both within and across modalities (motor-visual, motor-auditory). This suggests that the temporal recalibration of sensorimotor events is mainly caused by a shift in the motor component.
Journal Article
Effect of pitch–space correspondence on sound-induced visual motion perception
by Hidaka, Souta; Vroomen, Jean; Keetels, Mirjam
in Acoustic Stimulation; Analysis of Variance; Apparent motion
2013
The brain tends to associate specific features of stimuli across sensory modalities. The pitch of a sound, for example, is associated with spatial elevation, such that higher-pitched sounds are felt as being “up” in space and lower-pitched sounds as being “down.” Here we investigated whether changes in the pitch of sounds could induce visual motion perception in the same way that changes in the location of sounds do. We demonstrated that only sounds alternating in up/down location induced illusory vertical motion of a static visual stimulus, whereas sounds alternating in higher/lower pitch did not induce this illusion. The pitch of a sound did not even modulate the visual motion perception induced by sounds alternating in up/down location. Interestingly, though, sounds alternating in higher/lower pitch could become a driver for visual motion if they were paired in a previous exposure phase with vertical visual apparent motion. Thus, only after prolonged exposure did the pitch of a sound become an inducer of upper/lower visual motion. This occurred even if, during exposure, the pitch and location of the sounds were paired in an incongruent fashion. These findings indicate that pitch–space correspondence is not strong enough to drive or modulate visual motion perception. However, associative exposure could increase the saliency of pitch–space relationships, after which pitch could induce visual motion perception by itself.
Journal Article
Motor-induced visual motion: hand movements driving visual motion perception
2014
Visual perception can be changed by co-occurring input from other sensory modalities. Here, we explored how self-generated finger movements (left–right or up–down key presses) affect visual motion perception. In Experiment 1, motion perception of a blinking bar was shifted in the direction of co-occurring hand movements, indicative of motor-induced visual motion (MIVM). In Experiment 2, moving and static blinking bars were combined with either directional or stationary hand movements. Results showed that the directional component of the hand movement was crucial for MIVM, as stationary movements actually reduced visual motion perception. In Experiment 3, a two-alternative forced-choice task ruled out response bias and response strategies. All three experiments demonstrated that alternating key presses (either horizontally or vertically aligned) induce illusory visual motion, and that stationary motor movements (without a vertical or horizontal direction) induce the opposite effect, namely a decline in perceived visual motion (a more static percept).
Journal Article
The role of spatial disparity and hemifields in audio-visual temporal order judgments
by Keetels, Mirjam; Vroomen, Jean
in Acoustic Stimulation; Adult; Auditory Perception - physiology
2005
We explored whether sensitivity in audio-visual temporal order judgments (TOJs) was affected by the amount of spatial separation between a sound and a light, and by whether the sound and light were presented in the same or in different hemifields. Participants made TOJs about noise bursts and light flashes, and judged whether the stimuli came from the same location or not. Flashes were presented either in the left or the right hemifield (at ±10 degrees from central fixation), and sounds either came from the same location as the lights or at small or large disparities (20 or 40 degrees from the light, respectively), thereby crossing the hemifields or not. TOJs became more accurate (i.e., the just noticeable difference, JND, became smaller) when spatial disparity increased and when hemifields were crossed. Location discrimination of the sound and light was affected similarly. These results demonstrate that audio-visual TOJs depend critically both on the relative positions from which stimuli are presented and on whether the stimuli cross hemifields.
Journal Article
Auditory grouping occurs prior to intersensory pairing: evidence from temporal ventriloquism
by Stekelenburg, Jeroen; Keetels, Mirjam; Vroomen, Jean
in Acoustic Stimulation; Attention - physiology; Biological and medical sciences
2007
The authors examined how principles of auditory grouping relate to intersensory pairing. Two sounds that normally enhance sensitivity on a visual temporal order judgment task (i.e. temporal ventriloquism) were embedded in a sequence of flanker sounds that had either the same or a different frequency (Exp. 1), rhythm (Exp. 2), or location (Exp. 3). In all experiments, we found that temporal ventriloquism only occurred when the two capture sounds differed from the flankers, demonstrating that grouping of the sounds in the auditory stream took priority over intersensory pairing. By combining principles of auditory grouping with intersensory pairing, we demonstrate that capture sounds were, counter-intuitively, more effective when their locations differed from those of the lights than when they came from the same position as the lights.
Journal Article
Exposure to delayed visual feedback of the hand changes motor-sensory synchrony perception
by Keetels, Mirjam; Vroomen, Jean
in Acoustic Stimulation - methods; Action; Biological and medical sciences
2012
We examined whether the brain can adapt to temporal delays between a self-initiated action and the naturalistic visual feedback of that action. During an exposure phase, participants tapped with their index finger while seeing their own hand in real time (~0 ms delay) or delayed at 40, 80, or 120 ms. Following exposure, participants were tested with a simultaneity judgment (SJ) task in which they judged whether the video of their hand was synchronous or asynchronous with respect to their finger taps. The locations of the seen and the real hand were either different (Experiment 1) or aligned (Experiment 2). In both cases, the point of subjective simultaneity (PSS) was uniformly shifted in the direction of the exposure lags while sensitivity to visual-motor asynchrony decreased with longer exposure delays. These findings demonstrate that the brain is quite flexible in adjusting the timing relation between a motor action and the otherwise naturalistic visual feedback that this action engenders.
Journal Article