17 results for "Jentschke, Sebastian"
Processing of hierarchical syntactic structure in music
Hierarchical structure with nested nonlocal dependencies is a key feature of human language and can be identified theoretically in most pieces of tonal music. However, previous studies have argued against the perception of such structures in music. Here, we show processing of nonlocal dependencies in music. We presented chorales by J. S. Bach and modified versions in which the hierarchical structure was rendered irregular whereas the local structure was kept intact. Brain electric responses differed between regular and irregular hierarchical structures, in both musicians and nonmusicians. This finding indicates that, when listening to music, humans apply cognitive processes that are capable of dealing with long-distance dependencies resulting from hierarchically organized syntactic structures. Our results reveal that a brain mechanism fundamental for syntactic processing is engaged during the perception of music, indicating that processing of hierarchical structure with nested nonlocal dependencies is not just a key component of human language, but a multidomain capacity of human cognition.
When the statistical MMN meets the physical MMN
How do listeners respond to prediction errors within a patterned sequence of sounds? To answer this question we carried out a statistical learning study using electroencephalography (EEG). In a continuous auditory stream of sound triplets, the deviations were either (a) statistical, in terms of transitional probability, (b) physical, due to a change in sound location (left or right speaker), or (c) double deviants, i.e., a combination of the two. Statistical and physical deviants elicited a statistical mismatch negativity and a physical MMN, respectively. Most importantly, we found that the effects of statistical and physical deviants interacted (the statistical MMN was smaller when co-occurring with a physical deviant). These results show, for the first time, that the processing of prediction errors due to statistical learning is affected by prediction errors due to physical deviance. Our findings thus indicate that the statistical MMN interacts with the physical MMN, implying that prediction-error processing due to physical sound attributes suppresses the processing of learned statistical properties of sounds.
Musical training modulates the development of syntax processing in children
The question of how musical training can influence perceptual and cognitive abilities of children has been the subject of numerous past studies. However, evidence showing which neural mechanisms underlie changes in cognitive skills in another domain following musical training has remained sparse. Syntax processing in language and music has been shown to rely on overlapping neural resources, and this study compared the neural correlates of language- and music-syntactic processing between children with and without long-term musical training. Musically trained children had larger amplitudes of the ERAN (early right anterior negativity), elicited by music-syntactic irregularities. Furthermore, the ELAN (early left anterior negativity), a neurophysiological marker of syntax processing in language, was more strongly developed in these children, and they also showed an enlarged amplitude of a later negativity, assumed to reflect more sustained syntax processing. Thus, our data suggest that the neurophysiological mechanisms underlying syntax processing in music and language are developed earlier, and more strongly, in children with musical training.
Unpredictability of the “when” influences prediction error processing of the “what” and “where”
The capability to establish accurate predictions is an integral part of learning. Whether predictions about different dimensions of a stimulus interact with each other, and whether such an interaction affects learning, has remained elusive. We conducted a statistical learning study with EEG (electroencephalography), in which a stream of consecutive sound triplets was presented with deviants that were (a) statistical, depending on the triplet-ending probability, (b) physical, due to a change in sound location, or (c) double deviants, i.e., a combination of the two. We manipulated the predictability of stimulus onset by using random stimulus-onset asynchronies. Temporal unpredictability due to random onsets reduced the neurophysiological responses to statistical and location deviants, as indexed by the statistical mismatch negativity (sMMN) and the location MMN. Our results demonstrate that the predictability of one stimulus attribute influences the processing of prediction-error signals of other stimulus attributes, and thus also the learning of those attributes.
Neocortical substrates of feelings evoked with music in the ACC, insula, and somatosensory cortex
Neurobiological models of emotion focus traditionally on limbic/paralimbic regions as neural substrates of emotion generation, and insular cortex (in conjunction with isocortical anterior cingulate cortex, ACC) as the neural substrate of feelings. An emerging view, however, highlights the importance of isocortical regions beyond insula and ACC for the subjective feeling of emotions. We used music to evoke feelings of joy and fear, and multivariate pattern analysis (MVPA) to decode representations of feeling states in functional magnetic resonance imaging (fMRI) data of n = 24 participants. Most of the brain regions providing information about feeling representations were neocortical regions. These included, in addition to granular insula and cingulate cortex, primary and secondary somatosensory cortex, premotor cortex, frontal operculum, and auditory cortex. The multivoxel activity patterns corresponding to feeling representations emerged within a few seconds, gained in strength with increasing stimulus duration, and replicated results of a hypothesis-generating decoding analysis from an independent experiment. Our results indicate that several neocortical regions (including insula, cingulate, somatosensory and premotor cortices) are important for the generation and modulation of feeling states. We propose that secondary somatosensory cortex, which covers the parietal operculum and encroaches on the posterior insula, is of particular importance for the encoding of emotion percepts, i.e., preverbal representations of subjective feeling.
Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences
Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.
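The triplet design in the abstract above can be sketched as a short simulation. The sound labels and function name below are illustrative assumptions, not the authors' actual stimulus code; only the ending probabilities (10%, 30%, 60%) come from the abstract:

```python
import random

def make_triplet_stream(n_triplets, seed=0):
    """Generate triplets whose ending occurs with low (10%), intermediate
    (30%), or high (60%) conditional probability, mirroring the transition
    probabilities described in the abstract. "t1"/"t2" are placeholder
    labels for the first two (equiprobable) sounds of each triplet."""
    rng = random.Random(seed)
    endings = ["low", "intermediate", "high"]
    weights = [0.10, 0.30, 0.60]
    return [("t1", "t2", rng.choices(endings, weights=weights, k=1)[0])
            for _ in range(n_triplets)]

if __name__ == "__main__":
    stream = make_triplet_stream(10_000)
    counts = {e: sum(1 for *_, end in stream if end == e)
              for e in ["low", "intermediate", "high"]}
    # Empirical proportions approximate the designed transition probabilities.
    print({e: round(c / len(stream), 2) for e, c in counts.items()})
```

With enough triplets, the empirical ending frequencies converge on the designed 10/30/60 split, which is the statistical regularity that listeners in the study are assumed to learn.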
Effects of Aesthetic Chills on a Cardiac Signature of Emotionality
Previous studies have shown that a cardiac signature of emotionality (referred to as Eκ, which can be computed from the standard 12-lead electrocardiogram, ECG) predicts inter-individual differences in the tendency to experience and express positive emotion. Here, we investigated whether Eκ values can be transiently modulated during stimulation with participant-selected music pieces and film scenes that elicit strongly positive emotion. The phenomenon of aesthetic chills, as indicated by measurable piloerection on the forearm, was used to accurately locate moments of peak emotional responses during stimulation. From 58 healthy participants, continuous Eκ values, heart rate, and respiratory frequency were recorded during stimulation with film scenes and music pieces, and were related to the aesthetic chills. Eκ values, as well as heart rate, increased significantly during moments of peak positive emotion accompanied by piloerection. These results are the first to provide evidence for an influence of momentary psychological state on a cardiac signature of emotional personality (as reflected in Eκ values). The possibility to modulate ECG amplitude signatures via stimulation with emotionally significant music pieces and film scenes opens up new perspectives for the use of emotional peak experiences in the therapy of disorders characterized by flattened emotionality, such as depression or schizoid personality disorder.
Neural Correlates of Emotional Personality: A Structural and Functional Magnetic Resonance Imaging Study
Studies addressing brain correlates of emotional personality have remained sparse, despite the involvement of emotional personality in health and well-being. This study investigates structural and functional brain correlates of psychological and physiological measures related to emotional personality. Psychological measures included neuroticism, extraversion, and agreeableness scores, as assessed using a standard personality questionnaire. As a physiological measure we used a cardiac amplitude signature, the so-called Eκ value (computed from the electrocardiogram), which has previously been related to tender emotionality. Questionnaire scores and Eκ values were related to both functional (eigenvector centrality mapping, ECM) and structural (voxel-based morphometry, VBM) neuroimaging data. Functional magnetic resonance imaging (fMRI) data were obtained from 22 individuals (12 females) while listening to music (joy, fear, or neutral music). ECM results showed that agreeableness scores correlated with centrality values in the dorsolateral prefrontal cortex, the anterior cingulate cortex, and the ventral striatum (nucleus accumbens). Individuals with higher Eκ values (indexing higher tender emotionality) showed higher centrality values in the subiculum of the right hippocampal formation. Structural MRI data from an independent sample of 59 individuals (34 females) showed that neuroticism scores correlated with volume of the left amygdaloid complex. In addition, individuals with higher Eκ showed larger gray matter volume in the same portion of the subiculum in which individuals with higher Eκ showed higher centrality values. Our results highlight a role of the amygdala in neuroticism. Moreover, they indicate that a cardiac signature related to emotionality (Eκ) correlates with both function (increased network centrality) and structure (gray matter volume) of the subiculum of the hippocampal formation, suggesting a role of the hippocampal formation for emotional personality. Results are the first to show personality-related differences using eigenvector centrality mapping, and the first to show structural brain differences for a physiological measure associated with personality.
Music Perception Influences Language Acquisition: Melodic and Rhythmic-Melodic Perception in Children with Specific Language Impairment
Language and music share many properties, with a particularly strong overlap for prosody. Prosodic cues are generally regarded as crucial for language acquisition, and previous research has indicated that children with SLI fail to make use of these cues. As processing of prosodic information involves skills similar to those required in music perception, we compared music perception skills (melodic and rhythmic-melodic perception and melody recognition) in a group of children with SLI (N=29, five-year-olds) to two groups of controls: one of comparable age (N=39, five-year-olds) and one about one year younger but closer to the children with SLI in language skills (N=13, four-year-olds). Children with SLI performed below their age level in most tasks, more closely matching the performance level of the younger controls with similar language skills. These data strengthen the view of a strong relation between language acquisition and music processing. This might open a perspective for the possible use of musical material in early diagnosis of SLI and of music in SLI therapy.
Cardiac Signatures of Personality
There are well-established relations between personality and the heart, as evidenced by associations between negative emotions on the one hand, and coronary heart disease or chronic heart failure on the other. However, there are substantial gaps in our knowledge about relations between the heart and personality in healthy individuals. Here, we investigated whether amplitude patterns of the electrocardiogram (ECG) correlate with neuroticism, extraversion, agreeableness, warmth, positive emotion, and tender-mindedness as measured with the Neuroticism-Extraversion-Openness (NEO) personality inventory. Specifically, we investigated (a) whether a cardiac amplitude measure that was previously reported to be related to flattened affectivity (referred to as Eκ values) would explain variance of NEO scores, and (b) whether correlations can be found between NEO scores and amplitudes of the ECG. NEO scores and resting ECGs were obtained from 425 healthy individuals. Neuroticism and positive emotion significantly differed between individuals with high and low Eκ values. In addition, stepwise cross-validated regressions indicated correlations between ECG amplitudes and (a) agreeableness, as well as (b) positive emotion. These results are the first to demonstrate that ECG amplitude patterns provide information about the personality of an individual as measured with NEO personality scales and facets. These findings open new perspectives for a more efficient personality assessment using cardiac measures, as well as for more efficient risk stratification and pre-clinical diagnosis of individuals at risk for cardiac, affective and psychosomatic disorders.