Catalogue Search | MBRL
Explore the vast range of titles available.
109 result(s) for "Keltner, Dacher"
Self-report captures 27 distinct categories of emotion bridged by continuous gradients
2017
Emotions are centered in subjective experiences that people represent, in part, with hundreds, if not thousands, of semantic terms. Claims about the distribution of reported emotional states and the boundaries between emotion categories—that is, the geometric organization of the semantic space of emotion—have sparked intense debate. Here we introduce a conceptual framework to analyze reported emotional states elicited by 2,185 short videos, examining the richest array of reported emotional experiences studied to date and the extent to which reported experiences of emotion are structured by discrete and dimensional geometries. Across self-report methods, we find that the videos reliably elicit 27 distinct varieties of reported emotional experience. Further analyses revealed that categorical labels such as amusement better capture reports of subjective experience than commonly measured affective dimensions (e.g., valence and arousal). Although reported emotional experiences are represented within a semantic space best captured by categorical labels, the boundaries between categories of emotion are fuzzy rather than discrete. By analyzing the distribution of reported emotional states we uncover gradients of emotion—from anxiety to fear to horror to disgust, calmness to aesthetic appreciation to awe, and others—that correspond to smooth variation in affective dimensions such as valence and dominance. Reported emotional states occupy a complex, high-dimensional categorical space. In addition, our library of videos and an interactive map of the emotional states they elicit (https://s3-us-west-1.amazonaws.com/emogifs/map.html) are made available to advance the science of emotion.
Journal Article
The compassionate instinct : the science of human goodness
Leading scientists and science writers reflect on the life-changing, perspective-changing, new science of human goodness.
Emotional Expression: Advances in Basic Emotion Theory
by Sauter, Disa; Keltner, Dacher; Cowen, Alan
in Acknowledgment, Behavioral Science and Psychology, Emotion recognition
2019
In this article, we review recent developments in the study of emotional expression within a basic emotion framework. Dozens of new studies find that upwards of 20 emotions are signaled in multimodal and dynamic patterns of expressive behavior. Moving beyond word-to-stimulus matching paradigms, new studies are detailing the more nuanced and complex processes involved in emotion recognition and the structure of how people perceive emotional expression. Finally, we consider new studies documenting contextual influences upon emotion recognition. We conclude by extending these recent findings to questions about emotion-related physiology and the mammalian precursors of human emotion.
Journal Article
Mapping the Passions
2019
What would a comprehensive atlas of human emotions include? For 50 years, scientists have sought to map emotion-related experience, expression, physiology, and recognition in terms of the “basic six”—anger, disgust, fear, happiness, sadness, and surprise. Claims about the relationships between these six emotions and prototypical facial configurations have provided the basis for a long-standing debate over the diagnostic value of expression (for review and latest installment in this debate, see Barrett et al., p. 1). Building on recent empirical findings and methodologies, we offer an alternative conceptual and methodological approach that reveals a richer taxonomy of emotion. Dozens of distinct varieties of emotion are reliably distinguished by language, evoked in distinct circumstances, and perceived in distinct expressions of the face, body, and voice. Traditional models—both the basic six and the affective-circumplex model (valence and arousal)—capture a fraction of the systematic variability in emotional response. In contrast, emotion-related responses (e.g., the smile of embarrassment, triumphant postures, sympathetic vocalizations, blends of distinct expressions) can be explained by richer models of emotion. Given these developments, we discuss why tests of a basic-six model of emotion are not tests of the diagnostic value of facial expression more generally. Determining the full extent of what facial expressions can tell us, marginally and in conjunction with other behavioral and contextual cues, will require mapping the high-dimensional, continuous space of facial, bodily, and vocal signals onto richly multifaceted experiences using large-scale statistical modeling and machine-learning methods.
Journal Article
Sixteen facial expressions occur in similar contexts worldwide
2021
Understanding the degree to which human facial expressions co-vary with specific social contexts across cultures is central to the theory that emotions enable adaptive responses to important challenges and opportunities [1–6]. Concrete evidence linking social context to specific facial expressions is sparse and is largely based on survey-based approaches, which are often constrained by language and small sample sizes [7–13]. Here, by applying machine-learning methods to real-world, dynamic behaviour, we ascertain whether naturalistic social contexts (for example, weddings or sporting competitions) are associated with specific facial expressions [14] across different cultures. In two experiments using deep neural networks, we examined the extent to which 16 types of facial expression occurred systematically in thousands of contexts in 6 million videos from 144 countries. We found that each kind of facial expression had distinct associations with a set of contexts that were 70% preserved across 12 world regions. Consistent with these associations, regions varied in how frequently different facial expressions were produced as a function of which contexts were most salient. Our results reveal fine-grained patterns in human facial expressions that are preserved across the modern world.
An analysis of 16 types of facial expression in thousands of contexts in millions of videos revealed fine-grained patterns in human facial expression that are preserved across the modern world.
Journal Article
Higher social class predicts increased unethical behavior
2012
Seven studies using experimental and naturalistic methods reveal that upper-class individuals behave more unethically than lower-class individuals. In studies 1 and 2, upper-class individuals were more likely to break the law while driving, relative to lower-class individuals. In follow-up laboratory studies, upper-class individuals were more likely to exhibit unethical decision-making tendencies (study 3), take valued goods from others (study 4), lie in a negotiation (study 5), cheat to increase their chances of winning a prize (study 6), and endorse unethical behavior at work (study 7) than were lower-class individuals. Mediator and moderator data demonstrated that upper-class individuals’ unethical tendencies are accounted for, in part, by their more favorable attitudes toward greed.
Journal Article
Social Class, Contextualism, and Empathic Accuracy
2010
Recent research suggests that lower-class individuals favor explanations of personal and political outcomes that are oriented to features of the external environment. We extended this work by testing the hypothesis that, as a result, individuals of a lower social class are more empathically accurate in judging the emotions of other people. In three studies, lower-class individuals (compared with upper-class individuals) received higher scores on a test of empathic accuracy (Study 1), judged the emotions of an interaction partner more accurately (Study 2), and made more accurate inferences about emotion from static images of muscle movements in the eyes (Study 3). Moreover, the association between social class and empathic accuracy was explained by the tendency for lower-class individuals to explain social events in terms of features of the external environment. The implications of class-based patterns in empathic accuracy for well-being and relationship outcomes are discussed.
Journal Article
Income Inequality and White-on-Black Racial Bias in the United States
2019
Several theories predict that income inequality may produce increased racial bias, but robust tests of this hypothesis are lacking. We examined this relationship at the U.S. state level from 2004 to 2015 using Internal Revenue Service–based income-inequality statistics and two large-scale racial-bias data sources: Project Implicit (N = 1,554,109) and Google Trends. Using a multimethod approach, we found evidence of a significant positive within-state association between income inequality and Whites’ explicit racial bias. However, the effect was small, with income inequality accounting for 0.4% to 0.7% of within-state variation in racial bias, and was also contingent on model specification, with results dependent on the measure of income inequality used. We found no conclusive evidence linking income inequality to implicit racial bias or racially offensive Google searches. Overall, our findings admit multiple interpretations, but we discuss why statistically small effects of income inequality on explicit racial bias may nonetheless be socially meaningful.
Journal Article
The primacy of categories in the recognition of 12 emotions in speech prosody across two cultures
by Cowen, Alan S.; Elfenbein, Hillary Anger; Keltner, Dacher
in 4014/477/2811, 631/114/2397, Acknowledgment
2019
Central to emotion science is the degree to which categories, such as Awe, or broader affective features, such as Valence, underlie the recognition of emotional expression. To explore the processes by which people recognize emotion from prosody, US and Indian participants were asked to judge the emotion categories or affective features communicated by 2,519 speech samples produced by 100 actors from 5 cultures. With large-scale statistical inference methods, we find that prosody can communicate at least 12 distinct kinds of emotion that are preserved across the 2 cultures. Analyses of the semantic and acoustic structure of the recognition of emotions reveal that emotion categories drive the recognition of emotions more so than affective features, including Valence. In contrast to discrete emotion theories, however, emotion categories are bridged by gradients representing blends of emotions. Our findings, visualized within an interactive map, reveal a complex, high-dimensional space of emotional states recognized cross-culturally in speech prosody.
Whether emotions are universal across cultures is a central question in psychological research. In this study, Cowen et al. show that speech prosody can communicate at least 12 emotions that are recognized across two different cultures.
Journal Article