Transformers for Emotion Recognition

Challenge

With the advent of transformer models, Emotion Recognition in Conversation (ERC) systems have advanced significantly. Yet understanding emotions goes beyond assigning simplistic labels such as ‘happy’ or ‘angry’: it requires modeling the deeper cognitive processes and situational contexts that lie behind mere lexical associations.

Approach

In this project, we approach this problem from a spatiotemporal perspective and propose a novel, lightweight framework that completes the 5W paradigm (What, Who, Why, When, Where) for textual ERC. To achieve this, we integrate appraisal theory, a dominant psychological account of how emotions arise, into the transformer architecture.
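
The project description does not specify how appraisal information enters the model, so the following is only a minimal sketch of one plausible design: a small transformer utterance encoder whose pooled representation is fused with a vector of appraisal-dimension scores before emotion classification. The class name, appraisal dimensions, layer sizes, and late-fusion strategy are illustrative assumptions, not details of the actual framework.

```python
# Minimal sketch (assumed design, not the project's architecture):
# fusing appraisal-theory features with a transformer encoder for ERC.
import torch
import torch.nn as nn

class AppraisalAwareERC(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, n_heads=4,
                 n_layers=2, n_appraisals=5, n_emotions=7):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Project appraisal scores (e.g. pleasantness, controllability, ...)
        # into the model's hidden space; the dimensions are hypothetical.
        self.appraisal_proj = nn.Linear(n_appraisals, d_model)
        self.classifier = nn.Linear(2 * d_model, n_emotions)

    def forward(self, token_ids, appraisal_scores):
        # token_ids: (batch, seq_len); appraisal_scores: (batch, n_appraisals)
        hidden = self.encoder(self.embed(token_ids))       # contextual token states
        utterance = hidden.mean(dim=1)                     # pooled utterance vector
        appraisal = self.appraisal_proj(appraisal_scores)  # appraisal embedding
        fused = torch.cat([utterance, appraisal], dim=-1)  # late fusion
        return self.classifier(fused)                      # emotion logits

# Example forward pass with random inputs.
model = AppraisalAwareERC()
tokens = torch.randint(0, 30522, (2, 16))
appraisals = torch.rand(2, 5)
logits = model(tokens, appraisals)  # shape: (2, 7)
```

Late fusion is chosen here only for simplicity; an appraisal signal could equally be injected at the token level or via auxiliary prediction heads.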

Outcomes

Our empirical results demonstrate that the model achieves competitive performance on standard ERC benchmarks and surpasses previous state-of-the-art models.

Project Status

Ongoing

Lead Researchers

Aydin Javadov, Tobias Hoesli