693 results for "Wataru Sato"
Advancements in Sensors and Analyses for Emotion Sensing
Exploring the objective signals associated with subjective emotional states has practical significance [...].
Ear thermal imaging for emotion sensing
Thermal imaging, recognized for its non-contact and non-invasive properties, has been extensively used to investigate the physiological effects of emotions. While previous research has linked facial temperature changes to emotional arousal, the relationship between ear temperature and emotion remains unexplored. In this study, we acquired ear thermal imaging data and dynamic emotional ratings from 15 participants watching emotion-eliciting videos. Pixel-wise analysis revealed a negative correlation between ear temperature and emotional arousal across broad outer ear regions, including the antihelical fold, antihelix, and earlobe. These findings established ear temperature as a novel physiological marker of emotional arousal, providing new insights into thermophysiological responses to emotions. This breakthrough has important implications for affective computing, mental health monitoring, and real-time emotion recognition, expanding the potential applications of thermal imaging in emotion assessment.
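As a rough illustration of the pixel-wise analysis described above, the sketch below correlates each pixel's temperature time course with the continuous arousal ratings. It assumes the ear thermal video is already loaded as a NumPy array and the ratings are resampled to the video frame rate; all names and shapes are illustrative rather than the authors' actual pipeline.

```python
import numpy as np
from scipy import stats

def pixelwise_arousal_correlation(thermal, arousal):
    """Correlate each pixel's temperature time course with arousal.

    thermal : ndarray, shape (n_frames, height, width)
        Ear thermal video, one temperature per pixel per frame.
    arousal : ndarray, shape (n_frames,)
        Continuous arousal ratings resampled to the frame rate.
    Returns r_map and p_map, each of shape (height, width).
    """
    n_frames, h, w = thermal.shape
    pixels = thermal.reshape(n_frames, -1)            # (frames, pixels)

    # Pearson r for all pixels at once: z-score both series, then
    # average the products over time.
    z_pix = (pixels - pixels.mean(0)) / pixels.std(0)
    z_aro = (arousal - arousal.mean()) / arousal.std()
    r = (z_pix * z_aro[:, None]).mean(0)

    # Two-tailed p-values from the t distribution (df = n - 2).
    t = r * np.sqrt((n_frames - 2) / (1 - r ** 2))
    p = 2 * stats.t.sf(np.abs(t), df=n_frames - 2)
    return r.reshape(h, w), p.reshape(h, w)
```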
Exploration of Emotion Dynamics Sensing Using Trapezius EMG and Fingertip Temperature
Exploration of the physiological signals associated with subjective emotional dynamics has practical significance. Previous studies have reported that the dynamics of subjective emotional valence and arousal can be assessed using facial electromyography (EMG) and electrodermal activity (EDA), respectively. However, it remains unknown whether other methods can assess emotion dynamics. To investigate this, EMG of the trapezius muscle and fingertip temperature were tested. These measures, as well as facial EMG of the corrugator supercilii and zygomatic major muscles, EDA (skin conductance level) of the palm, and continuous ratings of subjective emotional valence and arousal, were recorded while participants (n = 30) viewed emotional film clips. Intra-individual subjective–physiological associations were assessed using correlation analysis and linear and polynomial regression models. Valence ratings were linearly associated with corrugator and zygomatic EMG, but showed no linear or curvilinear relationship with trapezius EMG. Arousal ratings were linearly associated with EDA and fingertip temperature but were not linearly or curvilinearly related to trapezius EMG. These data suggest that fingertip temperature can be used to assess the dynamics of subjective emotional arousal.
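The intra-individual regression analysis described here could be sketched as below, comparing a linear model with a second-order polynomial model for one physiological channel; the degree-2 choice and the channel names are assumptions based on the abstract's wording, not the study's exact models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

def fit_linear_and_quadratic(signal, rating):
    """Regress a continuous rating on one physiological channel.

    signal, rating : time-aligned 1-D arrays for one participant.
    Returns R^2 for the linear and the quadratic model.
    """
    X = signal.reshape(-1, 1)
    r2_linear = LinearRegression().fit(X, rating).score(X, rating)

    X_quad = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
    r2_quad = LinearRegression().fit(X_quad, rating).score(X_quad, rating)
    return r2_linear, r2_quad

# Hypothetical usage: arousal ratings vs. fingertip temperature
# (both time-aligned 1-D arrays for a single participant).
# r2_lin, r2_quad = fit_linear_and_quadratic(fingertip_temp, arousal)
```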
Development of Machine-Learning-Based Facial Thermal Image Analysis for Dynamic Emotion Sensing
Information on the relationship between facial thermal responses and emotional state is valuable for sensing emotion. Yet, previous research has typically relied on linear methods of analysis based on regions of interest (ROIs), which may overlook nonlinear pixel-wise information across the face. To address this limitation, we investigated the use of machine learning (ML) for pixel-level analysis of facial thermal images to estimate dynamic emotional arousal ratings. We collected facial thermal data from 20 participants who viewed five emotion-eliciting films and assessed their dynamic emotional self-reports. Our ML models, including random forest regression, support vector regression, ResNet-18, and ResNet-34, consistently demonstrated superior estimation performance compared to traditional simple or multiple linear regression models for the ROIs. To interpret the nonlinear relationships between facial temperature changes and arousal, saliency maps and integrated gradients were used for the ResNet-34 model. The results show nonlinear associations between arousal ratings and temperature changes at the nose tip, forehead, and cheeks. These findings imply that ML-based analysis of facial thermal images can estimate emotional arousal more effectively, pointing to potential applications of non-invasive emotion sensing for mental health, education, and human–computer interaction.
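A minimal sketch of one of the models mentioned, random forest regression on flattened thermal frames, follows. The synthetic data, image size, and chronological train/test split are placeholders; only the overall pixels-to-arousal setup mirrors the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Placeholder data: thermal frames (n_samples, height, width) and the
# dynamic arousal rating aligned with each frame (n_samples,).
rng = np.random.default_rng(0)
frames = rng.normal(size=(500, 32, 32)).astype(np.float32)
arousal = rng.normal(size=500)

X = frames.reshape(len(frames), -1)              # flatten pixels to features
X_train, X_test, y_train, y_test = train_test_split(
    X, arousal, test_size=0.2, shuffle=False)    # preserve temporal order

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("test R^2:", r2_score(y_test, model.predict(X_test)))
```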
Crosstalk in Facial EMG and Its Reduction Using ICA
There is ample evidence that electromyography (EMG) signals from the corrugator supercilii and zygomatic major muscles can provide valuable information for the assessment of subjective emotional experiences. Although previous research suggested that facial EMG data could be affected by crosstalk from adjacent facial muscles, it remains unproven whether such crosstalk occurs and, if so, how it can be reduced. To investigate this, we instructed participants (n = 29) to perform the facial actions of frowning, smiling, chewing, and speaking, in isolation and combination. During these actions, we measured facial EMG signals from the corrugator supercilii, zygomatic major, masseter, and suprahyoid muscles. We performed an independent component analysis (ICA) of the EMG data and removed crosstalk components. Speaking and chewing induced EMG activity in the masseter and suprahyoid muscles, as well as the zygomatic major muscle. The ICA-reconstructed EMG signals reduced the effects of speaking and chewing on zygomatic major activity, compared with the original signals. These data suggest that: (1) mouth actions could induce crosstalk in zygomatic major EMG signals, and (2) ICA can reduce the effects of such crosstalk.
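The ICA-based reconstruction might look roughly like the following with scikit-learn's FastICA. Deciding which independent components carry crosstalk is a domain judgment (e.g., components that track chewing and speaking); that selection is only stubbed here as a parameter.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_crosstalk(emg, crosstalk_components):
    """Reconstruct multi-channel EMG with selected ICs zeroed out.

    emg : ndarray, shape (n_samples, n_channels)
        e.g. corrugator, zygomatic, masseter, suprahyoid channels.
    crosstalk_components : list of int
        Indices of components judged to carry crosstalk, chosen in
        practice by inspecting each component's activity during
        mouth actions such as chewing and speaking.
    """
    ica = FastICA(n_components=emg.shape[1], random_state=0)
    sources = ica.fit_transform(emg)         # (n_samples, n_components)
    sources[:, crosstalk_components] = 0.0   # drop the crosstalk ICs
    return ica.inverse_transform(sources)    # back to channel space
```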
Influence of stimulus manipulation on conscious awareness of emotional facial expressions in the match-to-sample paradigm
The conscious perception of emotional facial expressions plays an indispensable role in social interaction. However, previous psychological studies have reported inconsistent findings regarding whether conscious awareness is greater for emotional expressions than for neutral expressions. Furthermore, whether this phenomenon is attributable to emotional or visual factors remains unknown. To investigate these issues, we conducted five psychological experiments to test the conscious perception of emotional and neutral facial expressions using the match-to-sample paradigm. Facial stimuli were momentarily presented in the peripheral visual fields while participants read simultaneously presented letters in the central visual fields. The participants selected a perceived face from nine samples. The results of all experiments demonstrated that emotional expressions were more accurately identified than neutral expressions. Furthermore, Experiment 4 showed that angry expressions were identified more accurately than anti-angry expressions, which expressed neutral emotions with comparable physical changes to angry expressions. Experiment 5, testing the interaction between emotional expression and face direction, showed that angry expressions looking toward participants were more accurately identified than those looking away from participants, even though they were physically identical. These results suggest that the conscious awareness of emotional facial expressions is enhanced by their emotional significance.
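The accuracy comparisons reported here reduce to per-participant identification accuracies compared across conditions; a hedged sketch, with the trial-record format invented for illustration.

```python
import numpy as np
from scipy import stats

def compare_identification_accuracy(trials):
    """Paired t-test on per-participant accuracy across two conditions.

    trials : iterable of (participant_id, condition, correct) records,
             condition in {"emotional", "neutral"}, correct in {0, 1}.
    Assumes every participant contributed trials in both conditions.
    """
    acc = {}
    for pid, cond, correct in trials:
        acc.setdefault(pid, {}).setdefault(cond, []).append(correct)
    emotional = [np.mean(a["emotional"]) for a in acc.values()]
    neutral = [np.mean(a["neutral"]) for a in acc.values()]
    return stats.ttest_rel(emotional, neutral)
```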
Assessing Automated Facial Action Unit Detection Systems for Analyzing Cross-Domain Facial Expression Databases
In the field of affective computing, achieving accurate automatic detection of facial movements is an important issue, and great progress has already been made. However, a systematic evaluation of the systems now available for analyzing dynamic facial databases remains an unmet need. This study compared the performance of three systems (FaceReader, OpenFace, AFARtoolbox) that detect each facial movement corresponding to an action unit (AU) derived from the Facial Action Coding System. All three systems detected the presence of AUs in the dynamic facial database at above-chance levels. Moreover, OpenFace and AFAR provided higher area under the receiver operating characteristic curve values than FaceReader. In addition, several confusion biases between facial components (e.g., AU12 and AU14) were observed for each automated AU detection system, and the static mode was superior to the dynamic mode for analyzing the posed facial database. These findings demonstrate the features of prediction patterns for each system and provide guidance for research on facial expressions.
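Scoring automated AU detectors against manual FACS codes, as in this comparison, amounts to a per-AU ROC analysis. A sketch with scikit-learn; the label and score arrays are hypothetical stand-ins for a coded database and one detector's frame-wise outputs.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def per_au_auc(y_true, y_score, au_names):
    """Area under the ROC curve for each action unit.

    y_true  : ndarray (n_frames, n_aus), 0/1 manual FACS codes
    y_score : ndarray (n_frames, n_aus), detector confidences
    """
    return {au: roc_auc_score(y_true[:, i], y_score[:, i])
            for i, au in enumerate(au_names)}

# Hypothetical usage for one detector on one database:
# aucs = per_au_auc(manual_codes, detector_scores, ["AU1", "AU12", "AU14"])
```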
The widespread action observation/execution matching system for facial expression processing
Observing and understanding others' emotional facial expressions, possibly through motor synchronization, plays a primary role in face-to-face communication. To understand the underlying neural mechanisms, previous functional magnetic resonance imaging (fMRI) studies investigated brain regions involved in both the observation and execution of emotional facial expressions and found that the neocortical motor regions constituting the action observation/execution matching system, or mirror neuron system, were active. However, it remains unclear (1) whether other regions in the limbic system, cerebellum, and brainstem could also be involved in the observation/execution matching system for processing facial expressions, and (2) if so, whether these regions could constitute a functional network. To investigate these issues, we performed fMRI while participants observed dynamic facial expressions of anger and happiness and while they executed facial muscle activity associated with angry and happy facial expressions. Conjunction analyses revealed that, in addition to neocortical regions (i.e., the right ventral premotor cortex and right supplementary motor area), the bilateral amygdala, right basal ganglia, bilateral cerebellum, and right facial nerve nucleus were activated during both the observation and execution tasks. Group independent component analysis revealed that a functional network component involving the aforementioned regions was activated during both tasks. The data suggest that the motor synchronization of emotional facial expressions involves a widespread observation/execution matching network encompassing the neocortex, limbic system, basal ganglia, cerebellum, and brainstem.
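At its simplest, a conjunction analysis of the kind reported asks which voxels exceed the statistical threshold in both the observation and execution contrasts (a minimum-statistic conjunction); a schematic with NumPy, assuming voxel-wise t-maps already registered to a common space.

```python
import numpy as np

def conjunction_mask(t_observe, t_execute, threshold):
    """Voxels significant in BOTH contrasts (minimum-statistic conjunction).

    t_observe, t_execute : ndarray, same shape (e.g. 3-D volumes) of
        voxel-wise t statistics for the observation and execution tasks.
    threshold : float, cutoff already corrected for multiple comparisons.
    """
    # min(t1, t2) > threshold is equivalent to both exceeding it.
    return (t_observe > threshold) & (t_execute > threshold)
```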
Emotional valence sensing using a wearable facial EMG device
Emotion sensing using physiological signals in real-life situations can be practically valuable. Previous studies have developed wearable devices that record autonomic nervous system activity, which reflects emotional arousal. However, no study has determined whether emotional valence can be assessed using wearable devices. To this end, we developed a wearable device to record facial electromyography (EMG) from the corrugator supercilii (CS) and zygomatic major (ZM) muscles. To validate the device, in Experiment 1 we used both a traditional wired device and our wearable device to record participants' facial EMG while they viewed emotional films. Participants then viewed the films again and continuously rated the subjective valence they recalled experiencing during the first viewing. The facial EMG signals recorded by both the wired and wearable devices showed that CS and ZM activities were, respectively, negatively and positively correlated with the continuous valence ratings. In Experiment 2, we used the wearable device to record participants' facial EMG while they played Wii Bowling games and assessed their cued-recall continuous valence ratings. CS and ZM activities were again correlated negatively and positively, respectively, with the continuous valence ratings. These data suggest that facial EMG signals recorded by a wearable device can be used to assess subjective emotional valence in future naturalistic studies.
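The validation analysis here boils down to correlating each rectified, smoothed EMG channel with the continuous valence ratings per participant; a sketch, with the sampling rate and smoothing window as placeholder values.

```python
import numpy as np
from scipy.stats import pearsonr

def emg_valence_correlation(emg, valence, fs=1000, window_s=1.0):
    """Correlate one EMG channel's envelope with valence ratings.

    emg     : raw samples for one channel (e.g. CS or ZM)
    valence : continuous valence ratings resampled to len(emg)
    fs      : sampling rate in Hz (placeholder value)
    """
    envelope = np.abs(emg)                    # full-wave rectification
    win = int(fs * window_s)
    smoothed = np.convolve(envelope, np.ones(win) / win, mode="same")
    return pearsonr(smoothed, valence)        # (r, p)
```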
Benton facial recognition test scores for predicting fusiform gyrus activity during face perception
Face perception is a fundamental cognitive ability essential for social interactions. The Benton Facial Recognition Test (BFRT) is widely used to assess both normal and impaired face perception. However, whether BFRT scores could reflect neural activity associated with face perception, particularly in relation to holistic or configural face processing, remains unclear. To address this question, we administered the short form of the BFRT and acquired functional magnetic resonance imaging (fMRI) data while participants passively viewed photographs of upright and inverted faces and houses. Regression analyses revealed that BFRT scores positively correlated with fMRI activity in the right fusiform gyrus in response to upright faces versus upright houses. Additionally, BFRT scores were positively associated with right fusiform gyrus activity in response to inverted versus upright faces. These findings suggest that BFRT scores serve as an indicator of right fusiform gyrus activity linked to holistic/configural face perception.
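The group-level analysis described could be sketched per region of interest as a simple linear regression of participants' contrast estimates on their BFRT scores; the function and variable names are illustrative, not the study's pipeline.

```python
import numpy as np
from scipy import stats

def bfrt_fmri_regression(bfrt_scores, roi_contrast):
    """Regress ROI contrast estimates on BFRT scores across participants.

    bfrt_scores  : 1-D array, one BFRT short-form score per participant
    roi_contrast : 1-D array, each participant's right fusiform contrast
                   estimate (e.g. upright faces > upright houses)
    """
    result = stats.linregress(bfrt_scores, roi_contrast)
    return {"slope": result.slope, "r": result.rvalue, "p": result.pvalue}
```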