Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
71 result(s) for "Patel, Aniruddh D"
The Evolutionary Biology of Musical Rhythm: Was Darwin Wrong?
2014
In The Descent of Man, Darwin speculated that our capacity for musical rhythm reflects basic aspects of brain function broadly shared among animals. Although this remains an appealing idea, it is being challenged by modern cross-species research. This research hints that our capacity to synchronize to a beat, i.e., to move in time with a perceived pulse in a manner that is predictive and flexible across a broad range of tempi, may be shared by only a few other species. Is this really the case? If so, it would have important implications for our understanding of the evolution of human musicality.
Journal Article
Beat-based dancing to music has evolutionary foundations in advanced vocal learning
2024
Dancing to music is ancient and widespread in human cultures. While dance shows great cultural diversity, it often involves nonvocal rhythmic movements synchronized to musical beats in a predictive and tempo-flexible manner. To date, the only nonhuman animals known to spontaneously move to music in this way are parrots. This paper proposes that human-parrot similarities in movement to music and in the neurobiology of advanced vocal learning hold clues to the evolutionary foundations of human dance. The proposal draws on recent research on the neurobiology of parrot vocal learning by Jarvis and colleagues and on a recent cortical model for speech motor control by Hickok and colleagues. These two lines of work are synthesized to suggest that gene regulation changes associated with the evolution of a dorsal laryngeal pitch control pathway in ancestral humans fortuitously strengthened auditory-parietal cortical connections that support beat-based rhythmic processing. More generally, the proposal aims to explain how and why the evolution of strong forebrain auditory-motor integration in the service of learned vocal control led to a capacity and proclivity to synchronize nonvocal movements to the beat. The proposal specifies cortical brain pathways implicated in the origins of human beat-based dancing and leads to testable predictions and suggestions for future research.
Journal Article
Response to commentaries by Schmidt and Kaplan, Penhune, Hickok and Theofanopoulou on “Beat-based dancing to music has evolutionary foundations in advanced vocal learning.”
2024
Each commentary on my article raises important points and new ideas for research on rhythmic processing in humans and other species. Here I respond to points concerning the role of social factors in the ontogeny of beat synchronization, the neural connectivity underlying beat synchronization, the evolution of this connectivity, and the mechanisms by which evolutionary changes in the strength of one white matter tract (driven by natural selection) can have knock-on effects on the structure of an adjacent tract.
Journal Article
Music and Language Syntax Interact in Broca’s Area: An fMRI Study
2015
Instrumental music and language are both syntactic systems, employing complex, hierarchically-structured sequences built using implicit structural norms. This organization allows listeners to understand the role of individual words or tones in the context of an unfolding sentence or melody. Previous studies suggest that the brain mechanisms of syntactic processing may be partly shared between music and language. However, functional neuroimaging evidence for anatomical overlap of brain activity involved in linguistic and musical syntactic processing has been lacking. In the present study we used functional magnetic resonance imaging (fMRI) in conjunction with an interference paradigm based on sung sentences. We show that the processing demands of musical syntax (harmony) and language syntax interact in Broca's area in the left inferior frontal gyrus (without leading to music and language main effects). A language main effect in Broca's area only emerged in the complex music harmony condition, suggesting that (with our stimuli and tasks) a language effect only becomes visible under conditions of increased demands on shared neural resources. In contrast to previous studies, our design allows us to rule out that the observed neural interaction is due to: (1) general attention mechanisms, as a psychoacoustic auditory anomaly behaved unlike the harmonic manipulation, (2) error processing, as the language and the music stimuli contained no structural errors. The current results thus suggest that two different cognitive domains, music and language, might draw on the same high-level syntactic integration resources in Broca's area.
Journal Article
Music, language, and the brain
by Patel, Aniruddh D.
in Auditory perception, Auditory perception -- Physiological aspects, Auditory Perception -- physiology
2010, 2007
In the first comprehensive study of the relationship between music and language from the standpoint of cognitive neuroscience, Aniruddh D. Patel challenges the widespread belief that music and language are processed independently. This volume argues that music and language share deep and critical connections, and that comparative research provides a powerful way to study the cognitive and neural mechanisms underlying these uniquely human abilities.
Vocal learning as a preadaptation for the evolution of human beat perception and synchronization
2021
The human capacity to synchronize movements to an auditory beat is central to musical behaviour and to debates over the evolution of human musicality. Have humans evolved any neural specializations for music processing, or does music rely entirely on brain circuits that evolved for other reasons? The vocal learning and rhythmic synchronization hypothesis proposes that our ability to move in time with an auditory beat in a precise, predictive and tempo-flexible manner originated in the neural circuitry for complex vocal learning. In the 15 years since the hypothesis was proposed, a variety of studies have supported it. However, one study has provided a significant challenge to the hypothesis. Furthermore, it is increasingly clear that vocal learning is not a binary trait animals have or lack, but varies more continuously across species. In the light of these developments and of recent progress in the neurobiology of beat processing and of vocal learning, the current paper revises the vocal learning hypothesis. It argues that an advanced form of vocal learning acts as a preadaptation for sporadic beat perception and synchronization (BPS), providing intrinsic rewards for predicting the temporal structure of complex acoustic sequences. It further proposes that in humans, mechanisms of gene-culture coevolution transformed this preadaptation into a genuine neural adaptation for sustained BPS. The larger significance of this proposal is that it outlines a hypothesis of cognitive gene-culture coevolution which makes testable predictions for neuroscience, cross-species studies and genetics.
This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.
Journal Article
Memory in time: Neural tracking of low-frequency rhythm dynamically modulates memory formation
by Race, Elizabeth; Patel, Aniruddh D.; Merseal, Hannah
in Acoustic Stimulation - methods, Adolescent, Adult
2020
Time is a critical component of episodic memory. Yet it is currently unclear how different types of temporal signals are represented in the brain and how these temporal signals support episodic memory. The current study investigated whether temporal cues provided by low-frequency environmental rhythms influence memory formation. Specifically, we tested the hypothesis that neural tracking of low-frequency rhythm serves as a mechanism of selective attention that dynamically biases the encoding of visual information at specific moments in time. Participants incidentally encoded a series of visual objects while passively listening to background, instrumental music with a steady beat. Objects either appeared in-synchrony or out-of-synchrony with the background beat. Participants were then given a surprise subsequent memory test (in silence). Results revealed significant neural tracking of the musical beat at encoding, evident in increased electrophysiological power and inter-trial phase coherence at the perceived beat frequency (1.25 Hz). Importantly, enhanced neural tracking of the background rhythm at encoding was associated with superior subsequent memory for in-synchrony compared to out-of-synchrony objects at test. Together, these results provide novel evidence that the brain spontaneously tracks low-frequency musical rhythm during naturalistic listening situations, and that the strength of this neural tracking is associated with the effects of rhythm on higher-order cognitive processes such as episodic memory.
• Electrophysiological responses track musical rhythm during memory encoding.
• Increased power and phase coherence occur at the musical beat frequency.
• Neural tracking of background rhythm is associated with enhanced subsequent memory.
• Stronger neural tracking linked to better memory for on-beat versus off-beat images.
Journal Article
Executive Function, Visual Attention and the Cocktail Party Problem in Musicians and Non-Musicians
2016
The goal of this study was to investigate how cognitive factors influence performance in a multi-talker, "cocktail-party" like environment in musicians and non-musicians. This was achieved by relating performance in a spatial hearing task to cognitive processing abilities assessed using measures of executive function (EF) and visual attention in musicians and non-musicians. For the spatial hearing task, a speech target was presented simultaneously with two intelligible speech maskers that were either colocated with the target (0° azimuth) or were symmetrically separated from the target in azimuth (at ±15°). EF assessment included measures of cognitive flexibility, inhibition control and auditory working memory. Selective attention was assessed in the visual domain using a multiple object tracking task (MOT). For the MOT task, the observers were required to track target dots (n = 1,2,3,4,5) in the presence of interfering distractor dots. Musicians performed significantly better than non-musicians in the spatial hearing task. For the EF measures, musicians showed better performance on measures of auditory working memory compared to non-musicians. Furthermore, across all individuals, a significant correlation was observed between performance on the spatial hearing task and measures of auditory working memory. This result suggests that individual differences in performance in a cocktail party-like environment may depend in part on cognitive factors such as auditory working memory. Performance in the MOT task did not differ between groups. However, across all individuals, a significant correlation was found between performance in the MOT and spatial hearing tasks. A stepwise multiple regression analysis revealed that musicianship and performance on the MOT task significantly predicted performance on the spatial hearing task. Overall, these findings confirm the relationship between musicianship and cognitive factors including domain-general selective attention and working memory in solving the "cocktail party problem".
Journal Article
Environmental rhythms orchestrate neural activity at multiple stages of processing during memory encoding: Evidence from event-related potentials
2020
Accumulating evidence suggests that rhythmic temporal structures in the environment influence memory formation. For example, stimuli that appear in synchrony with the beat of background, environmental rhythms are better remembered than stimuli that appear out-of-synchrony with the beat. This rhythmic modulation of memory has been linked to entrained neural oscillations which are proposed to act as a mechanism of selective attention that prioritize processing of events that coincide with the beat. However, it is currently unclear whether rhythm influences memory formation by influencing early (sensory) or late (post-perceptual) processing of stimuli. The current study used stimulus-locked event-related potentials (ERPs) to investigate the locus of stimulus processing at which rhythm temporal cues operate in the service of memory formation. Participants viewed a series of visual objects that either appeared in-synchrony or out-of-synchrony with the beat of background music and made a semantic classification (living/non-living) for each object. Participants’ memory for the objects was then tested (in silence). The timing of stimulus presentation during encoding (in-synchrony or out-of-synchrony with the background beat) influenced later ERPs associated with post-perceptual selection and orienting attention in time rather than earlier ERPs associated with sensory processing. The magnitude of post-perceptual ERPs also differed according to whether or not participants demonstrated a mnemonic benefit for in-synchrony compared to out-of-synchrony stimuli, and was related to the magnitude of the rhythmic modulation of memory performance across participants. These results support two prominent theories in the field, the Dynamic Attending Theory and the Oscillation Selection Hypothesis, which propose that neural responses to rhythm act as a core mechanism of selective attention that optimize processing at specific moments in time. Furthermore, they reveal that in addition to acting as a mechanism of early attentional selection, rhythm influences later, post-perceptual cognitive processes as events are transformed into memory.
Journal Article