Asset Details
eSEE-d: Emotional State Estimation Based on Eye-Tracking Dataset
by Manousos, Dimitris; Skaramagkas, Vasileios; Fotiadis, Dimitrios I.; Tripoliti, Evanthia; Tsiknakis, Manolis; Ktistakis, Emmanouil; Tachos, Nikolaos S.; Kazantzaki, Eleni
in Analysis / Arousal / Artificial intelligence / Catalysts / Datasets / emotion classification / emotion database / Emotional behavior / Emotions / Eye movements / eye tracking / Information management / Neural networks / Self-assessment / Usability testing / valence
2023
Journal Article
Overview
Affective state estimation is a research field that has gained increased attention from the research community in the last decade. Two of the main catalysts for this are the advancement of data analysis using artificial intelligence and the availability of high-quality video. Unfortunately, benchmarks and public datasets are limited, making the development of new methodologies and the implementation of comparative studies essential. The current work presents the eSEE-d database, a resource to be used for emotional state estimation based on eye-tracking data. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each of them followed by a neutral video. Participants rated four emotions (tenderness, anger, disgust, sadness) on a scale from 0 to 10, and these ratings were later translated into emotional arousal and valence levels. Furthermore, each participant completed three self-assessment questionnaires. An extensive analysis of the participants’ self-assessment questionnaire scores, as well as their ratings during the experiments, is presented. Moreover, eye and gaze features were extracted from the low-level eye-recorded metrics, and their correlations with the participants’ ratings are investigated. Finally, we take on the challenge of classifying arousal and valence levels based solely on eye and gaze features, leading to promising results. In particular, the Deep Multilayer Perceptron (DMLP) network we developed achieved an accuracy of 92% in distinguishing positive valence from non-positive and 81% in distinguishing low arousal from medium arousal. The dataset is made publicly available.
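The overview describes translating the four 0–10 emotion ratings into valence and arousal labels before classification. The sketch below illustrates one plausible way such a mapping could work; the thresholds, the treatment of tenderness as the sole positive emotion, and the function names are illustrative assumptions, not the authors' exact scheme.

```python
# Hypothetical sketch of turning 0-10 emotion ratings into valence/arousal
# labels, in the spirit of the labels described in the overview. Thresholds
# and the positive/negative emotion split are assumptions for illustration.

def valence_label(ratings):
    """ratings: dict with keys 'tenderness', 'anger', 'disgust', 'sadness'
    (each 0-10). Tenderness is treated here as the positive emotion and the
    other three as negative."""
    positive = ratings["tenderness"]
    negative = max(ratings["anger"], ratings["disgust"], ratings["sadness"])
    return "positive" if positive > negative else "non-positive"

def arousal_label(ratings):
    """Bucket the peak reported intensity into low/medium/high arousal."""
    peak = max(ratings.values())
    if peak <= 3:
        return "low"
    elif peak <= 7:
        return "medium"
    return "high"

sample = {"tenderness": 8, "anger": 1, "disgust": 0, "sadness": 2}
print(valence_label(sample), arousal_label(sample))  # positive high
```

Binary labels like these ("positive vs. non-positive" valence, "low vs. medium" arousal) match the classification targets the DMLP network is reported to distinguish.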
Publisher
MDPI AG