Eyes and Minds: Unveiling Emotional Recognition through Gaze and Brain Activity

Researcher(s)

  • Brishna Nazari, Computer Science, University of Delaware

Faculty Mentor(s)

  • Roghayeh Barmaki, University of Delaware

Abstract

Understanding emotional recognition is essential for improving social interactions and for developing effective AI systems. This study examines emotional recognition by analyzing gaze data and brain signals. Using a game based on the “Reading the Mind in the Eyes Test,” participants answered questions about emotions displayed in photographs while their eye movements and brain signals were recorded. We implemented two game versions, each preceded by a practice question: in Game One, participants answered 18 questions without feedback followed by 18 questions with feedback; in Game Two, the order was reversed, with 18 feedback questions followed by 18 without feedback. We hypothesized that participants who received feedback later (Game One) would display more focused gaze patterns and quicker response times than those who received feedback first (Game Two). Preliminary findings support this hypothesis: Game One participants showed more focused gaze patterns and responded more quickly, and brain activity varied with feedback timing. These results suggest that feedback timing significantly affects emotional recognition and cognitive processing. Our study offers insight into how gaze patterns and brain activity contribute to emotion recognition, which can inform tools for teaching emotion recognition and the design of AI systems. This research bridges the gap between emotion-recognition studies and practical applications in AI and educational tools.
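The two feedback schedules described in the abstract can be sketched as a small configuration helper. This is an illustrative sketch only; the function and field names are assumptions, not code from the study, and the feedback policy for the practice question is not specified in the abstract.

```python
# Illustrative sketch of the two game versions' question schedules.
# Each session begins with one practice question; the two versions
# differ only in whether the feedback block comes first or second.

def build_schedule(feedback_first: bool, block_size: int = 18) -> list[dict]:
    """Return an ordered list of question slots for one game version."""
    schedule = [{"phase": "practice", "feedback": False}]  # policy assumed
    schedule += [{"phase": "block1", "feedback": feedback_first}] * block_size
    schedule += [{"phase": "block2", "feedback": not feedback_first}] * block_size
    return schedule

# Game One: no-feedback block first, then feedback block.
game_one = build_schedule(feedback_first=False)
# Game Two: feedback block first, then no-feedback block.
game_two = build_schedule(feedback_first=True)
```

Under these assumptions each session contains 37 slots (1 practice + 18 + 18), and swapping a single flag produces the counterbalanced condition.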