A Brain Wave Encodes a Thousand Tokens: Modeling Inter-Cortical Neural Interactions for Effective EEG-based Emotion Recognition
Nilay Kumar, Priyansh Bhandari, G. Maragatham
2025-11-19
Summary
This paper focuses on recognizing human emotions using brain activity measured by electroencephalograms, or EEGs. It introduces a new deep learning model called RBTransformer that aims to improve the accuracy of emotion recognition by paying closer attention to how different parts of the brain communicate with each other during emotional experiences.
What's the problem?
Currently, many methods for recognizing emotions from EEG data treat the brain as if it's working in isolated parts. However, emotions aren't localized to one area; they involve complex interactions *between* different brain regions over time. Ignoring these interactions can lead to less accurate emotion recognition, because the full picture of how an emotion unfolds isn't being considered.
What's the solution?
The researchers developed RBTransformer, a new neural network architecture based on a 'Transformer' model. This model first converts the raw EEG signals into compact feature tokens the network can work with, then uses a mechanism called 'attention' to learn how different electrodes (measuring activity in different brain areas) influence each other. Essentially, it builds a map of region-to-region communication within the brain. Finally, it uses this map to predict the emotion being experienced, along dimensions like valence (positive or negative feeling), arousal (level of excitement), and dominance (feeling of control).
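The pipeline described above can be sketched in PyTorch. This is an illustrative toy, not the authors' implementation (their code is linked below): the class name, layer sizes, and mean-pooling choice are assumptions; the key idea it demonstrates is that each token corresponds to one electrode, so self-attention across tokens yields an electrode-by-electrode map of learned dependencies.

```python
import torch
import torch.nn as nn

class InterCorticalTransformer(nn.Module):
    """Sketch of an RBTransformer-style model (hypothetical, simplified).

    Each input token holds the band-wise features of one electrode, so
    attention across tokens produces an (electrodes x electrodes) matrix
    of learned inter-cortical dependencies.
    """

    def __init__(self, n_electrodes=32, n_bands=5, d_model=64,
                 n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.token_proj = nn.Linear(n_bands, d_model)  # embed BDE tokens
        # Electrode identity embedding: keeps track of which channel
        # (spatial location) each token came from.
        self.electrode_embed = nn.Embedding(n_electrodes, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)      # classification head

    def forward(self, bde):  # bde: (batch, electrodes, bands)
        ids = torch.arange(bde.size(1), device=bde.device)
        x = self.token_proj(bde) + self.electrode_embed(ids)
        x = self.encoder(x)              # attention across electrode tokens
        return self.head(x.mean(dim=1))  # pool electrodes, predict emotion

model = InterCorticalTransformer()
logits = model(torch.randn(8, 32, 5))  # 8 trials, 32 electrodes, 5 bands
print(tuple(logits.shape))             # (8, 2): one score per emotion class
```

For a binary valence task the two output logits would be fed to a cross-entropy loss; swapping `n_classes` covers the multi-class settings the paper evaluates.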
Why it matters?
This work is important because it demonstrates a more sophisticated way to analyze brain activity for emotion recognition. By accurately capturing the dynamic interplay between brain regions, RBTransformer achieves better results than previous methods on several standard datasets. This could have implications for applications like mental health monitoring, human-computer interaction, and understanding neurological disorders where emotional processing is affected.
Abstract
Human emotions are difficult to convey through words and are often abstracted in the process; however, electroencephalogram (EEG) signals can offer a more direct lens into emotional brain activity. Recent studies show that deep learning models can process these signals to perform emotion recognition with high accuracy. However, many existing approaches overlook the dynamic interplay between distinct brain regions, which can be crucial to understanding how emotions unfold and evolve over time, potentially aiding in more accurate emotion recognition. To address this, we propose RBTransformer, a Transformer-based neural network architecture that models inter-cortical neural dynamics of the brain in latent space to better capture structured neural interactions for effective EEG-based emotion recognition. First, the EEG signals are converted into Band Differential Entropy (BDE) tokens, which are then passed through Electrode Identity embeddings to retain spatial provenance. These tokens are processed through successive inter-cortical multi-head attention blocks that construct an electrode × electrode attention matrix, allowing the model to learn the inter-cortical neural dependencies. The resulting features are then passed through a classification head to obtain the final prediction. We conducted extensive experiments, specifically under subject-dependent settings, on the SEED, DEAP, and DREAMER datasets, over all three dimensions, Valence, Arousal, and Dominance (for DEAP and DREAMER), under both binary and multi-class classification settings. The results demonstrate that the proposed RBTransformer outperforms all previous state-of-the-art methods across all three datasets, over all three dimensions under both classification settings. The source code is available at: https://github.com/nnilayy/RBTransformer.
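The BDE tokenization step in the abstract can be illustrated with a short sketch. For a band-pass-filtered signal modeled as Gaussian, differential entropy has the closed form 0.5·ln(2πe·σ²), which is the standard feature in EEG emotion recognition. The sampling rate, filter order, and band edges below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_differential_entropy(signal, fs, band):
    """Differential entropy of one EEG channel within a frequency band.

    Under a Gaussian assumption, DE = 0.5 * ln(2*pi*e*sigma^2), where
    sigma^2 is the variance of the band-pass-filtered signal.
    """
    low, high = band
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)  # zero-phase band-pass filtering
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))

# One second of synthetic "EEG" at an assumed 128 Hz sampling rate,
# tokenized over five canonical frequency bands.
fs = 128
rng = np.random.default_rng(0)
x = rng.standard_normal(fs)
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}
bde_token = np.array([band_differential_entropy(x, fs, b)
                      for b in bands.values()])
print(bde_token.shape)  # one 5-dimensional BDE token for this electrode
```

Computing one such token per electrode per window yields the (electrodes × bands) input that the Transformer encoder then attends over.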