Catalogue Search | MBRL
Explore the vast range of titles available.
212 result(s) for "Spivey, Michael"
The Cambridge handbook of psycholinguistics
"Our ability to speak, write, understand speech, and read is critical to our ability to function in today's society. As such, psycholinguistics, or the study of how humans learn and use language, is a central topic in cognitive science. This comprehensive handbook is a collection of chapters written not by practitioners in the field, who can summarize the work going on around them, but by trailblazers from a wide array of subfields, who have been shaping the field of psycholinguistics over the last decade. Some topics discussed include how children learn language, how average adults understand and produce language, how language is represented in the brain, how brain-damaged individuals perform in terms of their language abilities, and computer-based models of language and meaning. This is required reading for advanced researchers, graduate students, and upper-level undergraduates who are interested in the recent developments and the future of psycholinguistics" -- Provided by publisher.
Biasing moral decisions by exploiting the dynamics of eye gaze
2015
Eye gaze is a window onto cognitive processing in tasks such as spatial memory, linguistic processing, and decision making. We present evidence that information derived from eye gaze can be used to change the course of individuals’ decisions, even when they are reasoning about high-level, moral issues. Previous studies have shown that when an experimenter actively controls what an individual sees the experimenter can affect simple decisions with alternatives of almost equal valence. Here we show that if an experimenter passively knows when individuals move their eyes the experimenter can change complex moral decisions. This causal effect is achieved by simply adjusting the timing of the decisions. We monitored participants’ eye movements during a two-alternative forced-choice task with moral questions. One option was randomly predetermined as a target. At the moment participants had fixated the target option for a set amount of time we terminated their deliberation and prompted them to choose between the two alternatives. Although participants were unaware of this gaze-contingent manipulation, their choices were systematically biased toward the target option. We conclude that even abstract moral cognition is partly constituted by interactions with the immediate environment and is likely supported by gaze-dependent decision processes. By tracking the interplay between individuals, their sensorimotor systems, and the environment, we can influence the outcome of a decision without directly manipulating the content of the information available to them.
Significance: Where people look generally reflects and reveals their moment-by-moment thought processes. This study introduces an experimental method whereby participants’ eye gaze is monitored and information about their gaze is used to change the timing of their decisions. Answers to difficult moral questions such as “Is murder justifiable?” can be influenced toward random alternatives based on looking patterns alone. We do this without presenting different arguments or response frames, as in other techniques of persuasion. Thus, the process of arriving at a moral decision is not only reflected in a participant’s eye gaze but can also be determined by it.
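As an illustrative aside, the gaze-contingent timing procedure this abstract describes can be sketched in a few lines. The function name, sample rate, and dwell threshold below are hypothetical stand-ins, not the study's actual parameters:

```python
def run_trial(gaze_samples, target, dwell_threshold_ms, sample_ms=20):
    """Accumulate dwell time on the (randomly predetermined) target
    option; once it crosses the threshold, interrupt deliberation and
    prompt the choice."""
    dwell = 0
    for i, looked_at in enumerate(gaze_samples):
        if looked_at == target:
            dwell += sample_ms
            if dwell >= dwell_threshold_ms:
                return i * sample_ms   # time at which the prompt fires
    return None                        # threshold never reached

gaze = ["A", "B", "A", "A", "A", "B"]  # one gaze sample every 20 ms
prompt_time = run_trial(gaze, target="A", dwell_threshold_ms=60)
```

Because the prompt is yoked to the participant's own looking behavior, the manipulation changes only the timing of the decision, never the content on screen.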
Journal Article
Coordination dynamics of multi-agent interaction in a musical ensemble
by Proksch, Shannon; Spivey, Michael; Balasubramaniam, Ramesh
in 631/477; 631/477/2811; Humanities and Social Sciences
2022
Humans interact with other humans at a variety of timescales and in a variety of social contexts. We exhibit patterns of coordination that may differ depending on whether we are genuinely interacting as part of a coordinated group of individuals vs merely co-existing within the same physical space. Moreover, the local coordination dynamics of an interacting pair of individuals in an otherwise non-interacting group may spread, propagating change in the global coordination dynamics and interaction of an entire crowd. Dynamical systems analyses, such as Recurrence Quantification Analysis (RQA), can shed light on some of the underlying coordination dynamics of multi-agent human interaction. We used RQA to examine the coordination dynamics of a performance of “Welcome to the Imagination World”, composed for wind orchestra. This performance enacts a real-life simulation of the transition from uncoordinated, non-interacting individuals to a coordinated, interacting multi-agent group. Unlike previous studies of social interaction in musical performance which rely on different aspects of video and/or acoustic data recorded from each individual, this project analyzes group-level coordination patterns solely from the group-level acoustic data of an audio recording of the performance. Recurrence and stability measures extracted from the audio recording increased when musicians coordinated as an interacting group. Variability in these measures also increased, indicating that the interacting ensemble of musicians were able to explore a greater variety of behavior than when they performed as non-interacting individuals. As an orchestrated (non-emergent) example of coordination, we believe these analyses provide an indication of approximate expected distributions for recurrence patterns that may be measurable before and after truly emergent coordination.
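For readers unfamiliar with RQA, its core quantity (the recurrence rate) can be sketched as follows. The embedding dimension, delay, and radius below are illustrative choices, not the parameters used in the study:

```python
import numpy as np

def recurrence_rate(signal, dim=3, delay=1, radius=0.1):
    """Fraction of pairs of delay-embedded states that lie within
    `radius` of each other -- the simplest RQA measure."""
    n = len(signal) - (dim - 1) * delay
    # Time-delay embedding: each row is one reconstructed state.
    states = np.column_stack([signal[i * delay : i * delay + n]
                              for i in range(dim)])
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    rec = dists < radius
    np.fill_diagonal(rec, False)          # ignore trivial self-matches
    return rec.mean()

t = np.linspace(0, 8 * np.pi, 400)
periodic = np.sin(t)                      # a coordinated, repeating signal
noise = np.random.default_rng(0).uniform(-1, 1, 400)
```

A coordinated, repeating signal revisits the same regions of state space and so shows far more recurrence than an uncoordinated one, which is the intuition behind using recurrence and stability measures to track the ensemble's transition into interaction.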
Journal Article
A potential mechanism for Gibsonian resonance: behavioral entrainment emerges from local homeostasis in an unsupervised reservoir network
by Falandays, J. Benjamin; Warren, William H.; Spivey, Michael J.
in Agents (artificial intelligence); Artificial Intelligence; Behavior
2024
While the cognitivist school of thought holds that the mind is analogous to a computer, performing logical operations over internal representations, the tradition of ecological psychology contends that organisms can directly “resonate” to information for action and perception without the need for a representational intermediary. The concept of resonance has played an important role in ecological psychology, but it remains a metaphor. Supplying a mechanistic account of resonance requires a non-representational account of central nervous system (CNS) dynamics. Towards this, we present a series of simple models in which a reservoir network with homeostatic nodes is used to control a simple agent embedded in an environment. This network spontaneously produces behaviors that are adaptive in each context, including (1) visually tracking a moving object, (2) substantially above-chance performance in the arcade game Pong, and (3) avoiding walls while controlling a mobile agent. Upon analyzing the dynamics of the networks, we find that behavioral stability can be maintained without the formation of stable or recurring patterns of network activity that could be identified as neural representations. These results may represent a useful step towards a mechanistic grounding of resonance and a view of the CNS that is compatible with ecological psychology.
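A minimal sketch of the kind of mechanism this abstract describes, assuming a tanh reservoir whose nodes each adjust an individual threshold toward a shared activation target. All parameter values and the driving signal below are illustrative, not the authors':

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
W = rng.normal(0, 1 / np.sqrt(n), (n, n))   # fixed random recurrent weights
w_in = rng.normal(0, 1, n)                  # fixed random input weights
theta = np.zeros(n)                         # per-node homeostatic thresholds
target, eta = 0.1, 0.05                     # target activation, adaptation rate

x = np.zeros(n)
mean_rates = []
for t in range(500):
    u = np.sin(0.1 * t)                     # external driving signal
    x = np.tanh(W @ x + w_in * u - theta)
    # Local homeostasis: each node nudges its own threshold so its
    # activation drifts toward the shared target level -- no global
    # error signal or supervised training is involved.
    theta += eta * (x - target)
    mean_rates.append(x.mean())
```

The point of the sketch is that the only learning rule is local and unsupervised; any entrainment to the input emerges from each node regulating itself.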
Journal Article
Making the Invisible Visible: Verbal but Not Visual Cues Enhance Visual Detection
2010
Can hearing a word change what one sees? Although visual sensitivity is known to be enhanced by attending to the location of the target, perceptual enhancements following cues to the identity of an object have been difficult to find. Here, we show that perceptual sensitivity is enhanced by verbal, but not visual, cues.
Participants completed an object detection task in which they made an object-presence or -absence decision to briefly-presented letters. Hearing the letter name prior to the detection task increased perceptual sensitivity (d'). A visual cue in the form of a preview of the to-be-detected letter did not. Follow-up experiments found that the auditory cuing effect was specific to validly cued stimuli. The magnitude of the cuing effect positively correlated with an individual measure of vividness of mental imagery; introducing uncertainty into the position of the stimulus did not reduce the magnitude of the cuing effect, but eliminated the correlation with mental imagery.
Hearing a word made otherwise invisible objects visible. Interestingly, seeing a preview of the target stimulus did not similarly enhance detection of the target. These results are compatible with an account in which auditory verbal labels modulate lower-level visual processing. The findings show that a verbal cue in the form of hearing a word can influence even the most elementary visual processing and inform our understanding of how language affects perception.
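The sensitivity measure d' reported in this abstract is the distance between the z-transformed hit and false-alarm rates. A brief sketch with made-up rates (not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Illustrative rates: a valid verbal cue that raises hits without
# raising false alarms yields a higher d'.
cued = d_prime(0.80, 0.20)     # ~1.68
uncued = d_prime(0.65, 0.20)   # ~1.23
```

Because d' factors out response bias, an increase after a verbal cue reflects genuinely better detection, not merely a greater willingness to say "present".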
Journal Article
Continuous Dynamics in Real-Time Cognition
2006
Real-time cognition is best described not as a sequence of logical operations performed on discrete symbols but as a continuously changing pattern of neuronal activity. The continuity in these dynamics indicates that, in between describable states of mind, mental activity does not lend itself to the linguistic labels relied on by much of psychology. We discuss eye-tracking and mouse-tracking evidence for this temporal continuity and provide geometric visualizations of mental activity, depicting it as a continuous trajectory through a state space (a multi-dimensional space in which locations correspond to mental states). When the state of the system travels toward a frequently visited region of that space, the destination may constitute recognition of a particular word or a particular object; but on the way there, the majority of the mental trajectory is in intermediate regions of that space, revealing graded mixtures of mental states.
Journal Article
Continuous Attraction toward Phonological Competitors
by Spivey, Michael J.; Grosjean, Marc; Knoblich, Günther
in Adult; Biological Sciences; Cognition
2005
Certain models of spoken-language processing, like those for many other perceptual and cognitive processes, posit continuous uptake of sensory input and dynamic competition between simultaneously active representations. Here, we provide compelling evidence for this continuity assumption by using a continuous response, hand movements, to track the temporal dynamics of lexical activations during real-time spoken-word recognition in a visual context. By recording the streaming x, y coordinates of continuous goal-directed hand movement in a spoken-language task, online accrual of acoustic-phonetic input and competition between partially active lexical representations are revealed in the shape of the movement trajectories. This hand-movement paradigm allows one to project the internal processing of spoken-word recognition onto a two-dimensional layout of continuous motor output, providing a concrete visualization of the attractor dynamics involved in language processing.
Journal Article
Graded motor responses in the time course of categorizing atypical exemplars
by Kehoe, Caitlin; Spivey, Michael J.; Dale, Rick
in Activity levels. Psychomotricity; Animal cognition; Behavior
2007
The time course of categorization was investigated in four experiments, which revealed graded competitive effects in a categorization task. Participants clicked one of two categories (e.g., mammal or fish) in response to atypical or typical exemplars (e.g., whale or cat) in the form of words (Experiments 1 and 2) or pictures (Experiments 3 and 4). Streaming x, y coordinates of mouse movement trajectories were recorded. Normalized mean trajectories revealed a graded competitive process: Atypical exemplars produced trajectories with greater curvature toward the competing category than did typical exemplars. The experiments contribute to recent examination of the time course of categorization and carry implications for theories of representation in cognitive science.
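The trajectory-curvature measure this abstract alludes to is often computed as the maximum perpendicular deviation of the normalized trajectory from a straight start-to-end path. A small sketch with hypothetical coordinates:

```python
import numpy as np

def max_deviation(traj):
    """Maximum perpendicular distance of a (normalized) mouse trajectory
    from the straight line joining its first and last points -- a common
    index of attraction toward the competing response option."""
    traj = np.asarray(traj, dtype=float)
    start, end = traj[0], traj[-1]
    direction = (end - start) / np.linalg.norm(end - start)
    rel = traj - start
    # Remove the component along the start-end line; what remains is
    # each sample's perpendicular deviation.
    perp = rel - np.outer(rel @ direction, direction)
    return np.linalg.norm(perp, axis=1).max()

straight = [(0, 0), (0.5, 0.5), (1, 1)]   # e.g. a typical exemplar
curved = [(0, 0), (0.8, 0.2), (1, 1)]     # pulled toward the competitor
```

On this index, greater curvature for atypical exemplars is read as graded competition between the two category responses unfolding during the movement.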
Journal Article
Redundant spoken labels facilitate perception of multiple items
by Lupyan, Gary; Spivey, Michael J.
in Attention; Auditory Processing; Behavioral Science and Psychology
2010
Because of the strong associations between verbal labels and the visual objects that they denote, hearing a word may quickly guide the deployment of visual attention to the named objects. We report six experiments in which we investigated the effect of hearing redundant (noninformative) object labels on the visual processing of multiple objects from the named category. Even though the word cues did not provide additional information to the participants, hearing a label resulted in faster detection of attention probes appearing near the objects denoted by the label. For example, hearing the word "chair" resulted in more effective visual processing of all of the chairs in a scene relative to trials in which the participants attended to the chairs without actually hearing the label. This facilitation was mediated by stimulus typicality. Transformations of the stimuli that disrupted their association with the label while preserving the low-level visual features eliminated the facilitative effect of the labels. In the final experiment, we show that hearing a label improves the accuracy of locating multiple items matching the label, even when eye movements are restricted. We posit that verbal labels dynamically modulate visual processing via top-down feedback, an instance of linguistic labels greasing the wheels of perception.
Journal Article