Catalogue Search | MBRL
9 result(s) for "Addleman, Douglas A."
Visual and Auditory Spatial Localization in Younger and Older Adults
2022
Visual and auditory localization abilities are crucial in real-life tasks such as navigation and social interaction. Aging is frequently accompanied by vision and hearing loss, affecting spatial localization. The purpose of the current study is to elucidate the effect of typical aging on spatial localization and to establish a baseline for older individuals with pathological sensory impairment. Using a verbal report paradigm, we investigated how typical aging affects visual and auditory localization performance, the reliance on vision during sound localization, and sensory integration strategies when localizing audiovisual targets. Fifteen younger adults (N = 15, mean age = 26 yrs) and thirteen older adults (N = 13, mean age = 68 yrs) participated in this study, all with age-adjusted normal vision and hearing based on clinical standards. There were significant localization differences between younger and older adults, with the older group missing peripheral visual stimuli at significantly higher rates, localizing central stimuli as more peripheral, and being less precise in localizing sounds from central locations when compared to younger subjects. Both groups localized auditory targets better when the test space was visible compared to auditory localization when blindfolded. The two groups also exhibited similar patterns of audiovisual integration, showing optimal integration in central locations that was consistent with a Maximum-Likelihood Estimation model, but non-optimal integration in peripheral locations. These findings suggest that, despite the age-related changes in auditory and visual localization, the interactions between vision and hearing are largely preserved in older individuals without pathological sensory impairments.
Journal Article
No evidence for proactive suppression of explicitly cued distractor features
by Addleman, Douglas A.; Störmer, Viola S.
in Behavioral Science and Psychology, Brief Report, Cognitive Psychology
2022
Visual search benefits from advance knowledge of nontarget features. However, it is unknown whether these negatively cued features are suppressed in advance (proactively) or during search (reactively). To test this, we presented color cues varying from trial to trial that predicted target or nontarget colors. Experiment 1 (N = 96) showed that both target and nontarget cues speeded search. To test whether attention proactively modified cued feature representations, in Experiment 2 (N = 200) we interleaved color probe and search trials and had participants detect the color of a briefly presented ring that could either match the cued color or not. People detected positively cued colors better than other colors, whereas negatively cued colors were detected no better or worse than other colors. These results demonstrate that nontarget features are not suppressed proactively, and instead suggest that anticipated nontarget features are ignored via reactive mechanisms.
Journal Article
Distractor ignoring is as effective as target enhancement when incidentally learned but not when explicitly cued
by Addleman, Douglas A.; Störmer, Viola S.
in Accuracy, Attention, Behavioral Science and Psychology
2023
Explicit knowledge about upcoming target or distractor features can increase performance in tasks like visual search. However, explicit distractor cues generally result in smaller performance benefits than target cues, suggesting that suppressing irrelevant information is less effective than enhancing relevant information. Is this asymmetry a general principle of feature-based attention? Across four experiments (N = 75 each), we compared the efficiency of target selection and distractor ignoring through either incidental experience or explicit instructions. Participants searched for an orientation-defined target amidst seven distractors: three in the target color and four in another color. In Experiment 1, either targets (Exp. 1a) or distractors (Exp. 1b) were presented more often in a specific color than other possible search colors. Response times showed comparable benefits of learned attention towards (Exp. 1a) and away from (Exp. 1b) the frequent color, suggesting that learned target selection and distractor ignoring can be equally effective. In Experiment 2, participants completed a nearly identical task, only with explicit cues to the target (Exp. 2a) or distractor color (Exp. 2b), inducing voluntary attention. Both target and distractor cues were beneficial for search performance, but distractor cues much less so than target cues, consistent with previous results. Cross-experiment analyses verified that the relative inefficiency of distractor ignoring versus target selection is a unique characteristic of voluntary attention that is not shared by incidentally learned attention, pointing to dissociable mechanisms of voluntary and learned attention to support distractor ignoring.
Journal Article
Simulated central vision loss does not impair implicit location probability learning when participants search through simple displays
2022
Central vision loss disrupts voluntary shifts of spatial attention during visual search. Recently, we reported that a simulated scotoma impaired learned spatial attention towards regions likely to contain search targets. In that task, search items were overlaid on natural scenes. Because natural scenes can induce explicit awareness of learned biases leading to voluntary shifts of attention, here we used a search display with a blank background less likely to induce awareness of target location probabilities. Participants searched both with and without a simulated central scotoma: a training phase contained targets more often in one screen quadrant and a testing phase contained targets equally often in all quadrants. In Experiment 1, training used no scotoma, while testing alternated between blocks of scotoma and no-scotoma search. Experiment 2 training included the scotoma and testing again alternated between scotoma and no-scotoma search. Response times and saccadic behaviors in both experiments showed attentional biases towards the high-probability target quadrant during scotoma and no-scotoma search. Whereas simulated central vision loss impairs learned spatial attention in the context of natural scenes, our results show that this may not arise from impairments to the basic mechanisms of attentional learning indexed by visual search tasks without scenes.
Journal Article
Implicit location probability learning does not induce baseline shifts of visuospatial attention
by Remington, Roger W.; Addleman, Douglas A.; Jiang, Yuhong V.
in Behavioral Science and Psychology, Bias, Brief Report
2019
We tested whether implicit learning causes shifts of spatial attention in advance of or in response to stimulus onset. Participants completed randomly interspersed trials of letter search, which involved reporting the orientation of a T among Ls, and scene search, which involved identifying which of four scenes was from a target category (e.g., forest). In Experiment 1, an initial phase more often contained target letters in one screen quadrant, while the target scenes appeared equally often in all quadrants. Participants persistently prioritized letter targets in the more probable region, but the implicitly learned preference did not affect the unbiased scene task. In Experiment 2, the spatial probabilities of the scene and letter tasks reversed. Participants unaware of the probability manipulation acquired only a spatial bias to scene targets in the more probable region, with no effect on letter search. Instead of recruiting baseline shifts of spatial attention prior to stimulus onset, implicit learning of target probability yields task-dependent shifts of spatial attention following stimulus onset. Such shifts may involve attentional behaviors unique to certain task contexts.
Journal Article
The Effects of Selection History on Visual and Auditory Spatial Attention
2020
Past research has demonstrated implicit experience-driven effects on spatial attention in vision and audition. In particular, what and where an observer has attended in the past affects future attentional selection. For instance, attention while searching for an item is biased towards locations which contained recent targets—an effect called inter-trial location priming—as well as towards locations which contain targets more often than other regions over a span of time—an effect called location probability learning. In this dissertation, I present three studies investigating selection history effects and how they differ from the better-understood goal-driven form of attention. The first two studies investigate the relationship between spatial selection history and top-down attention during visual search. Study 1 investigated how attending to spatial locations during a visual search task for letters affected a secondary memory task for scenes presented underneath the search array. Implicit location probability learning and goal-driven attention both affected search performance, but only goal-driven attention affected memory for scenes at attended locations. This suggests that implicitly learned probability learning has task-specific effects on attention, while goal-driven attention has task-general effects. Study 2 showed that, unlike goal-driven attention, implicit location probability learning causes shifts of visuospatial attention only after search stimuli appear, not in anticipation of stimulus onset. Study 3 investigated short-term and long-term auditory selection history effects, finding long-term location probability learning but a striking lack of short-term inter-trial location priming. Taken together, this dissertation provides evidence for differences in the implementation of goal-driven and implicitly learned spatial attention that, while present in both vision and audition, manifest in modality-specific ways.
Dissertation