Catalogue Search | MBRL
75 result(s) for "Atkinson, Anthony P."
Foveal processing of emotion-informative facial features
2021
Certain facial features provide useful information for recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye or cheek or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combination of emotions used. There was no consistent evidence suggesting that reflexive first saccades targeted emotion-relevant features; instead, they targeted the closest feature to initial fixation. In a third experiment, angry, fearful, surprised and disgusted expressions were presented for 5 seconds. Duration of task-related fixations in the eyes, brow, nose and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features contributes to emotion recognition, but that these features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.
Journal Article
The neuropsychology of face perception: beyond simple dissociations and functional selectivity
2011
Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging identity, emotion and trustworthiness, appear to rely on an intact face-processing network that includes the occipital face area (OFA), whereas lower-level face categorization abilities, such as discriminating faces from objects, can be achieved without OFA, perhaps via the direct connections to the fusiform face area (FFA) from several extrastriate cortical areas. Some lesion, transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging (fMRI) findings argue against a strict feed-forward hierarchical model of face perception, in which the OFA is the principal and common source of input for other visual and non-visual cortical regions involved in face perception, including the FFA, face-selective superior temporal sulcus and somatosensory cortex. Instead, these findings point to a more interactive model in which higher-level face perception abilities depend on the interplay between several functionally and anatomically distinct neural regions. Furthermore, the nature of these interactions may depend on the particular demands of the task. We review the lesion and TMS literature on this topic and highlight the dynamic and distributed nature of face processing.
Journal Article
Friendship habits questionnaire: A measure of group- versus dyadic-oriented socializing styles
by Howlett, Philip; Baysu, Gülseli; Rychlowska, Magdalena
in Analysis; Biology and Life Sciences; Competitiveness
2023
Friendships are central to our social lives, yet little is known about individual differences associated with the number of friends people enjoy spending time with. Here we present the Friendship Habits Questionnaire (FHQ), a new scale of group versus dyadic-oriented friendship styles. Three studies investigated the psychometric properties of group-oriented friendships and the relevant individual differences. The initially developed questionnaire measured individual differences in extraversion as well as desire for intimacy, competitiveness, and group identification, traits that previous research links with socializing in groups versus one-to-one friendships. In three validation studies involving more than 800 participants (353 men, age M = 25.76) and using principal and confirmatory factor analyses, we found that the structure of the FHQ is best described with four dimensions: extraversion, intimacy, positive group identification, and negative group identification. Therefore, competitiveness was dropped from the final version of the FHQ. Moreover, FHQ scores reliably predicted the size of friendship groups in which people enjoy socializing, suggesting good construct validity. Together, our results document individual differences in pursuing group versus dyadic-oriented friendships and provide a new tool for measuring such differences.
Journal Article
Modulation of the face- and body-selective visual regions by the motion and emotion of point-light face and body stimuli
by Atkinson, Anthony P.; Smithson, Hannah E.; Vuong, Quoc C.
in Adaptation, Physiological - physiology; Adult; Affect - physiology
2012
Neural regions selective for facial or bodily form also respond to facial or bodily motion in highly form-degraded point-light displays. Yet it is unknown whether these face-selective and body-selective regions are sensitive to human motion regardless of stimulus type (faces and bodies) or to the specific motion-related cues characteristic of their proprietary stimulus categories. Using fMRI, we show that facial and bodily motions activate selectively those populations of neurons that code for the static structure of faces and bodies. Bodily (vs. facial) motion activated body-selective EBA bilaterally and right but not left FBA, irrespective of whether observers judged the emotion or color-change in point-light angry, happy and neutral stimuli. Facial (vs. bodily) motion activated face-selective right and left FFA, but only during emotion judgments for right FFA. Moreover, the strength of responses to point-light bodies vs. faces positively correlated with voxelwise selectivity for static bodies but not faces, whereas the strength of responses to point-light faces positively correlated with voxelwise selectivity for static faces but not bodies. Emotional content carried by point-light form-from-motion cues was sufficient to enhance the activity of several regions, including bilateral EBA and right FFA and FBA. However, although the strength of emotional modulation in right and left EBA by point-light body movements was related to the degree of voxelwise selectivity to static bodies but not static faces, there was no evidence that emotional modulation in fusiform cortex occurred in a similarly stimulus category-selective manner. This latter finding strongly constrains the claim that emotionally expressive movements modulate precisely those neuronal populations that code for the viewed stimulus category.
► Point-light body vs. face motion activates body- but not face-selective regions.
► Point-light face vs. body motion activates left fusiform face area (FFA).
► Right FFA activation to point-light faces for emotion but not color judgments.
► Emotional modulation of body and face areas by point-light body but not face motion.
Journal Article
Emotional modulation of body-selective visual areas
by Peelen, Marius V.; Vuilleumier, Patrik; Atkinson, Anthony P.
in Affect; Amygdala - physiology; Body Image
2007
Emotionally expressive faces have been shown to modulate activation in visual cortex, including face-selective regions in ventral temporal lobe. Here, we tested whether emotionally expressive bodies similarly modulate activation in body-selective regions. We show that dynamic displays of bodies with various emotional expressions vs neutral bodies, produce significant activation in two distinct body-selective visual areas, the extrastriate body area and the fusiform body area. Multi-voxel pattern analysis showed that the strength of this emotional modulation was related, on a voxel-by-voxel basis, to the degree of body selectivity, while there was no relation with the degree of selectivity for faces. Across subjects, amygdala responses to emotional bodies positively correlated with the modulation of body-selective areas. Together, these results suggest that emotional cues from body movements produce topographically selective influences on category-specific populations of neurons in visual cortex, and these increases may implicate discrete modulatory projections from the amygdala.
Journal Article
The development of visually guided stepping
2019
Adults use vision during stepping and walking to fine-tune foot placement. However, the developmental profile of visually guided stepping is unclear. We asked (1) whether children use online vision to fine-tune precise steps and (2) whether precision stepping develops as part of broader visuomotor development, alongside other fundamental motor skills like reaching. With 6- (N = 11), 7- (N = 11) and 8-year-olds (N = 11) and adults (N = 15), we manipulated visual input during steps and reaches. Using motion capture, we measured step and reach error, and postural stability. We expected that (1) both steps and reaches would be visually guided, (2) with similar developmental profiles, (3) foot placement biases that promote stability, and (4) correlations between postural stability and step error. Children used vision to fine-tune both steps and reaches. At all ages, foot placement was biased (albeit not in the predicted directions). Contrary to our predictions, step error was not correlated with postural stability. By 8 years, children’s step and reach error were adult-like. Despite similar visual control mechanisms, stepping and reaching had different developmental profiles: step error reduced with age whilst reach error was lower and stable with age. We argue that the development of both visually guided and non-visually guided action is limb-specific.
Journal Article
Dissociable Processing of Emotional and Neutral Body Movements Revealed by μ-Alpha and Beta Rhythms
2018
Both when actions are executed and observed, electroencephalography (EEG) has shown reduced alpha-band (8-12 Hz) oscillations over sensorimotor cortex. This 'μ-alpha' suppression is thought to reflect mental simulation of action, which has been argued to support internal representation of others' emotional states. Despite the proposed role of simulation in emotion perception, little is known about the effect of emotional content on μ-suppression. We recorded high-density EEG while participants viewed point-light displays of emotional vs neutral body movements in 'coherent' biologically plausible and 'scrambled' configurations. Although coherent relative to scrambled stimuli elicited μ-alpha suppression, the comparison of emotional and neutral movement, controlling for basic visual input, revealed suppression effects in both alpha and beta bands. Whereas alpha-band activity reflected reduced power for emotional stimuli in central and occipital sensors, beta power at frontocentral sites was driven by enhancement for neutral relative to emotional actions. A median-split by autism-spectrum quotient score revealed weaker μ-alpha suppression and beta enhancement in participants with autistic tendencies, suggesting that sensorimotor simulation may be differentially engaged depending on social capabilities. Consistent with theories of embodied emotion, these data support a link between simulation and social perception while more firmly connecting emotional processing to the activity of sensorimotor systems.
Journal Article
Discrimination of fearful and happy body postures in 8-month-old infants: an event-related potential study
by Missana, Manuela; Atkinson, Anthony P.; Rajhans, Purva
in Babies; body expressions; Brain research
2014
Responding to others' emotional body expressions is an essential social skill in humans. Adults readily detect emotions from body postures, but it is unclear whether infants are sensitive to emotional body postures. We examined 8-month-old infants' brain responses to emotional body postures by measuring event-related potentials (ERPs) to happy and fearful bodies. Our results revealed two emotion-sensitive ERP components: body postures evoked an early N290 at occipital electrodes and a later Nc at fronto-central electrodes that were enhanced in response to fearful (relative to happy) expressions. These findings demonstrate that: (a) 8-month-old infants discriminate between static emotional body postures; and (b) similar to infant emotional face perception, the sensitivity to emotional body postures is reflected in early perceptual (N290) and later attentional (Nc) neural processes. This provides evidence for an early developmental emergence of the neural processes involved in the discrimination of emotional body postures.
Journal Article