Search Results
207 result(s) for "de Gelder, Beatrice"
Why bodies? Twelve reasons for including bodily expressions in affective neuroscience
Why then have whole bodies and bodily expressions not attracted the attention of researchers so far? The goal of this article is to contribute some elements for an answer to this question. I believe that there is something to learn from the historical neglect of bodies and bodily expressions. I will next address some historical misconceptions about whole-body perception, and in the process I intend not only to provide an impetus for this kind of work but also to contribute to a better understanding of the significance of the affective dimension of behaviour, mind and brain as seen from the vantage point of bodily communication. Subsequent sections discuss available evidence for the neurofunctional basis of facial and bodily expressions as well as neuropsychological and clinical studies of bodily expressions.
Neural bases of the non-conscious perception of emotional signals
Key Points
• Many emotional signals are processed without being consciously perceived.
• Non-conscious perception of emotional stimuli is present both in healthy observers, as a consequence of experimental manipulation, and in neurological conditions resulting from focal brain damage, such as hemispatial neglect and cortical blindness.
• An emotional stimulus can be perceived non-consciously because it falls outside the focus of attention (attentional unawareness) or because its sensory analysis is hampered (sensory unawareness). Although both phenomena render the observer unaware of the stimulus, they involve different neural processes.
• Non-conscious perception of emotional stimuli involves a neural system composed of subcortical structures such as the superior colliculus, the visual pulvinar and the amygdala. This system receives visual information directly from the retina, bypassing the visual cortex, and has an old evolutionary origin, being present in other species such as birds, rats and monkeys.
• The function of this subcortical system is to provide a rapid but coarse analysis of visual stimuli in order to generate reflex-like responses to emotional signals in the environment.
• Neurophysiological changes and expressive reactions associated with non-conscious perception of emotional stimuli are consistently more rapid and more intense than responses associated with conscious perception of the same stimuli.
• The subcortical system for emotion processing influences cortical activity in several direct and indirect ways; the extent of this cortico–subcortical integration is a crucial factor affecting visual awareness.
Emotional stimuli, such as a fear-expressing face, can be processed without being consciously perceived and can influence behaviour. Tamietto and de Gelder describe the subcortical pathway that processes such stimuli, and discuss whether subcortical versus cortical processing translates into non-conscious versus conscious perception. An interview with Beatrice de Gelder for Neuropod is available for download.
Many emotional stimuli are processed without being consciously perceived. Recent evidence indicates that subcortical structures have a substantial role in this processing. These structures are part of a phylogenetically ancient pathway that has specific functional properties and that interacts with cortical processes. There is now increasing evidence that non-consciously perceived emotional stimuli induce distinct neurophysiological changes and influence behaviour towards the consciously perceived world. Understanding the neural bases of the non-conscious perception of emotional signals will clarify the phylogenetic continuity of emotion systems across species and the integration of cortical and subcortical activity in the human brain.
Impaired face and body perception in developmental prosopagnosia
Prosopagnosia is a deficit in face recognition in the presence of relatively normal object recognition. Together with older lesion studies, recent brain-imaging results provide evidence for the closely related representations of faces and objects and, more recently, for brain areas sensitive to faces and bodies. This evidence raises the issue of whether developmental prosopagnosics may also have an impairment in encoding bodies. We investigated the first stages of face, body, and object perception in four developmental prosopagnosics by comparing event-related potentials to canonically and upside-down presented stimuli. Normal configural encoding was absent in three of four developmental prosopagnosics for faces at the P1 and for both faces and bodies at the N170 component. Our results demonstrate that prosopagnosics do not have this normal processing routine readily available for faces or bodies. A profound face recognition deficit characteristic of developmental prosopagnosia may not necessarily originate in a category-specific face recognition deficit in the initial stages of development. It may also have its roots in anomalous processing of the configuration, a visual routine that is important for other stimuli besides faces. Faces and bodies trigger configuration-based visual strategies that are crucial in initial stages of stimulus encoding but also serve to bootstrap the acquisition of more feature-based visual skills that progressively build up in the course of development.
From Empathy to Apathy: The Bystander Effect Revisited
The bystander effect, the reduction in helping behavior in the presence of other people, has been explained predominantly by situational influences on decision making. Diverging from this view, we highlight recent evidence on the neural mechanisms and dispositional factors that determine apathy in bystanders. We put forward a new theoretical perspective that integrates emotional, motivational, and dispositional aspects. In the presence of other bystanders, personal distress is enhanced, and fixed action patterns of avoidance and freezing dominate. This new perspective suggests that bystander apathy results from a reflexive emotional reaction dependent on the personality of the bystander.
Towards the neurobiology of emotional body language
Emotional body language is a rapidly emerging research field in cognitive neuroscience. de Gelder reviews the body's role in our understanding of emotion, action and communication, and discusses similarities in the neuroanatomy and temporal dynamics between face and body perception. People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are automatically perceived and understood, and their role in emotional communication and decision-making.
Face identity matching is influenced by emotions conveyed by face and body
Faces provide information about multiple characteristics, such as personal identity and emotion. Classical models of face perception postulate separate sub-systems for identity and expression recognition, but recent studies have documented emotional contextual influences on the recognition of faces. The present study reports three experiments in which participants were presented with realistic face-body compounds in a 2 category (face and body) × 2 emotion (neutral and fearful) factorial design. The task always consisted of two-alternative forced-choice facial identity matching. The results show that during simultaneous face identity matching, task-irrelevant bodily expressions influence the processing of facial identity, under conditions of unlimited viewing (Experiment 1) as well as during brief (750 ms) presentation (Experiment 2). In addition, delayed (5000 ms) face identity matching of rapidly (150 ms) presented face-body compounds was also influenced by the body expression (Experiment 3). The results indicate that face identity perception mechanisms interact with the processing of bodily and facial expressions.
The Influence of Conscious and Unconscious Body Threat Expressions on Motor Evoked Potentials Studied With Continuous Flash Suppression
The observation of threatening expressions in others is a strong cue for triggering an action response. One method of capturing such action responses is measuring the amplitude of motor evoked potentials (MEPs) elicited with single-pulse TMS over the primary motor cortex. Indeed, it has been shown that viewing whole-body expressions of threat modulates MEP amplitude. Furthermore, emotional cues have been shown to act on certain brain areas even outside of conscious awareness. In the current study, we explored whether the influence of viewing whole-body expressions of threat extends to stimuli presented outside of conscious awareness in healthy participants. To accomplish this, we combined the measurement of MEPs with a continuous flash suppression task. In Experiment 1, participants were presented with images of neutral bodies, fearful bodies, or objects that were either perceived consciously or unconsciously, while single pulses of TMS were applied at different times after stimulus onset (200, 500, or 700 ms). In Experiment 2, stimuli consisted of neutral bodies, angry bodies, or objects, and pulses were applied at either 200 or 400 ms post stimulus onset. In Experiment 1, there was a general effect of the time of stimulation, but no condition-specific effects were evident. In Experiment 2, there were no significant main effects or interactions. Future studies should look into earlier effects of MEP modulation by emotional body stimuli, specifically when presented outside of conscious awareness, and should explore other outcome measures such as intracortical facilitation.
Rapid influence of emotional scenes on encoding of facial expressions: an ERP study
In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes than for faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes compared with fearful faces in happy scenes, an effect expressed in differences in left occipito-temporal scalp topography. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.
The role of computational and subjective features in emotional body expressions
Humans are experts at recognizing intent and emotion from other people’s body movements; however, the underlying mechanisms are poorly understood. Here, we computed quantitative features of body posture and kinematics and acquired behavioural ratings of these feature descriptors to investigate their role in affective whole-body movement perception. Representational similarity analyses and classification regression trees were used to investigate the relation of emotional categories to both the computed features and behavioural ratings. Overall, postural rather than kinematic features discriminated better between emotional movements for the computed as well as for the behavioural features. In particular, limb angles and symmetry appeared to be the most relevant ones. This was observed independently of whether or not the time-related information was preserved in the computed features. Interestingly, the behavioural ratings showed a clearer distinction between affective movements than the computed counterparts. Finally, the perceived directionality of the movement (i.e. towards or away from the observer) was found to be critical for the recognition of fear and anger.
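The feature pipeline this abstract describes (posture descriptors plus a representational similarity analysis relating them to behavioural ratings) can be illustrated with a short example. The Python sketch below is a hypothetical illustration, not the authors' code: the joint names, the limb-angle and symmetry descriptors, and the Spearman-based RSA are all assumptions.

```python
# Minimal, hypothetical sketch of posture-feature extraction and a simple
# representational similarity analysis (RSA); joint names, feature choices
# and the correlation measure are illustrative assumptions, not the
# authors' actual pipeline.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by the segments b-a and b-c."""
    v1, v2 = a - b, c - b
    cos_ang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos_ang, -1.0, 1.0))

def postural_features(pose):
    """pose: dict mapping joint name -> 2-D coordinate for one body posture."""
    elbow_l = joint_angle(pose["shoulder_l"], pose["elbow_l"], pose["wrist_l"])
    elbow_r = joint_angle(pose["shoulder_r"], pose["elbow_r"], pose["wrist_r"])
    knee_l = joint_angle(pose["hip_l"], pose["knee_l"], pose["ankle_l"])
    knee_r = joint_angle(pose["hip_r"], pose["knee_r"], pose["ankle_r"])
    # Crude left-right symmetry: small angle differences -> high symmetry.
    symmetry = -(abs(elbow_l - elbow_r) + abs(knee_l - knee_r))
    return np.array([elbow_l, elbow_r, knee_l, knee_r, symmetry])

def rsa_correlation(computed, rated):
    """Spearman correlation between the dissimilarity structures of computed
    features and behavioural ratings (one row per movement stimulus)."""
    rdm_computed = pdist(computed, metric="euclidean")
    rdm_rated = pdist(rated, metric="euclidean")
    rho, p_value = spearmanr(rdm_computed, rdm_rated)
    return rho, p_value
```

With one row of postural_features per movement stimulus and a matching matrix of behavioural ratings, rsa_correlation compares the two dissimilarity structures in the spirit of the analysis described above.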
The representation and plasticity of body emotion expression
Emotions are expressed by the face, the voice and the whole body. Research on the face and the voice has demonstrated not only that emotions are perceived categorically, but also that this perception can be manipulated. The purpose of this study was to investigate, in two separate experiments using adaptation and multisensory techniques, whether the perception of body emotion expressions also shows categorical effects and plasticity. We used an approach developed for studies investigating both face and voice emotion perception and created novel morphed affective body stimuli, which varied in small incremental steps between emotions. Participants were instructed to perform an emotion categorisation of these morphed bodies after adaptation to bodies conveying different expressions (Experiment 1), or while simultaneously hearing affective voices (Experiment 2). We show not only that body expression is perceived categorically, but also that both adaptation to affective body expressions and concurrent presentation of vocal affective information can shift the categorical boundary between body expressions, specifically for angry body expressions. Overall, our findings provide significant new insights into emotional body categorisation, which may prove important for gaining a deeper understanding of body expression perception in everyday social situations.