Next-Generation Emotion Recognition

NeuroFeel

Bridging laboratory precision with real-world emotion recognition through advanced domain adaptation

Our revolutionary framework brings together physiological sensing and machine learning to deliver consistent emotion recognition across different environments and datasets.

WESAD Personalization

Our personalized approach delivers exceptional accuracy through adaptive model selection, achieving 88.16% accuracy in discrete emotion recognition.

Accuracy: 88.16%

Cross-Dataset Transfer

Our pioneering framework bridges laboratory and real-world environments, achieving 80.27% valence recognition accuracy across different datasets.

Domain Gap Reduction: 89%
Our Approach

Revolutionizing Emotion Recognition Across Environments

NeuroFeel tackles the fundamental challenge of making emotion recognition systems work consistently across different environments and datasets

Cross-Dataset Analysis
Breaking free from single-dataset limitations

Our framework systematically combines WESAD (lab-controlled) and K-EmoCon (in-the-wild) datasets, enabling emotion recognition across different data collection contexts.

2 Datasets
26 Subjects
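The two datasets also differ sharply in sampling rate (WESAD's chest sensors record at 700 Hz, K-EmoCon's wearables at 4 Hz), so their signals must first be put on a common timeline. A minimal sketch of that step using plain linear interpolation on a toy signal (illustrative only; the actual NeuroFeel preprocessing may differ):

```python
import numpy as np

def resample(signal, src_hz, dst_hz):
    """Downsample a 1-D signal onto a common timeline via linear interpolation."""
    duration = len(signal) / src_hz
    t_src = np.arange(len(signal)) / src_hz
    t_dst = np.arange(0.0, duration, 1.0 / dst_hz)
    return np.interp(t_dst, t_src, signal)

# 10 s of a slow oscillation sampled at 700 Hz, brought down to 4 Hz
sig = np.sin(2 * np.pi * 0.1 * np.arange(7000) / 700.0)
low = resample(sig, 700, 4)   # 40 samples for 10 s at 4 Hz
```

Linear interpolation is only safe for slowly varying, feature-level signals; raw high-frequency ECG would need low-pass filtering before such a downsample.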
Domain Adaptation
Advanced techniques to bridge the domain gap

Our novel ensemble approach combines CORAL, subspace alignment, and feature scaling to significantly reduce the domain gap between laboratory and real-world settings.

3 Techniques
89% Gap Reduction
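Of the three techniques, CORAL (CORrelation ALignment) is the easiest to show compactly: it re-colors source-domain features so their covariance matches the target domain. A minimal NumPy sketch on toy data (illustrative, not the production pipeline):

```python
import numpy as np

def coral(source, target, eps=1e-5):
    """CORAL: re-color source features so their covariance matches the target.

    source, target: (n_samples, n_features) arrays from the two domains.
    """
    cs = np.cov(source, rowvar=False) + eps * np.eye(source.shape[1])
    ct = np.cov(target, rowvar=False) + eps * np.eye(target.shape[1])
    whiten = np.linalg.inv(np.linalg.cholesky(cs).T)   # source -> identity covariance
    recolor = np.linalg.cholesky(ct).T                 # identity -> target covariance
    return source @ whiten @ recolor

rng = np.random.default_rng(0)
lab = rng.normal(0.0, 2.0, size=(200, 4))    # stand-in for lab features
wild = rng.normal(0.0, 0.5, size=(200, 4))   # stand-in for in-the-wild features
aligned = coral(lab, wild)
```

After the transform, the aligned features share the target domain's second-order statistics, which is exactly the mismatch CORAL is designed to close.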
Bidirectional Evaluation
Train anywhere, test anywhere

Our framework uniquely provides comprehensive bidirectional evaluation, allowing models to be trained on one dataset and tested on another in both directions.

Bidirectional
Balanced Metrics
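In code, bidirectional evaluation is just the train/test loop run in both directions. A self-contained sketch with a deliberately simple nearest-centroid classifier standing in for the real models (the datasets here are synthetic):

```python
import numpy as np

def fit_centroids(X, y):
    """A deliberately simple stand-in classifier: one mean vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def score(centroids, X, y):
    """Accuracy of nearest-centroid prediction on (X, y)."""
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    pred = np.array(classes)[dists.argmin(axis=0)]
    return float((pred == y).mean())

def bidirectional_eval(X_a, y_a, X_b, y_b):
    """Train on one dataset, evaluate on the other, in both directions."""
    return {
        "a->b": score(fit_centroids(X_a, y_a), X_b, y_b),
        "b->a": score(fit_centroids(X_b, y_b), X_a, y_a),
    }

# Synthetic stand-ins for two feature-harmonized datasets
rng = np.random.default_rng(1)
X_a = rng.normal(size=(150, 6))
y_a = (X_a[:, 0] > 0).astype(int)
X_b = rng.normal(size=(150, 6)) + 0.3   # shifted domain
y_b = (X_b[:, 0] > 0.3).astype(int)
scores = bidirectional_eval(X_a, y_a, X_b, y_b)
```

Reporting both directions matters because transfer is rarely symmetric: a model trained on clean lab data often degrades more in the wild than the reverse.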
Our Frameworks

The Science Behind NeuroFeel

Two complementary frameworks that represent the future of emotion recognition technology

WESAD Personalization

Adaptive Model Selection

Dynamically selects between base and personal models based on confidence thresholds

Four-Model Architecture

Base, Personal, Ensemble, and Adaptive models working in harmony

Minimal Calibration

Achieves high accuracy with very limited personalization data

Model Accuracy Comparison
Base Model: 86.36%
Personal Model: 84.51%
Ensemble Model: 87.52%
Adaptive Model: 88.16%

Adaptive Selection Architecture

Base Model
Universal patterns from all subjects
Personal Model
Tailored to individual
Confidence Analyzer
Dynamic selection based on prediction confidence
Low Confidence ↔ High Confidence
Adaptive Model (88.16%)
Optimal performance across all subjects
NeuroFeel's adaptive model selection visualized
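The confidence-based switch in the diagram above reduces to a per-sample choice between two probability vectors. A minimal sketch (the 0.6 threshold and the probabilities are illustrative, not NeuroFeel's tuned values):

```python
import numpy as np

def adaptive_predict(base_proba, personal_proba, threshold=0.6):
    """Per sample, use the personal model's prediction when it is confident
    enough, otherwise fall back to the base model's prediction."""
    confident = personal_proba.max(axis=1) >= threshold
    return np.where(confident,
                    personal_proba.argmax(axis=1),
                    base_proba.argmax(axis=1))

base = np.array([[0.7, 0.3], [0.8, 0.2], [0.9, 0.1]])        # base model probabilities
personal = np.array([[0.2, 0.8], [0.55, 0.45], [0.9, 0.1]])  # personal model probabilities
preds = adaptive_predict(base, personal)
# sample 0: personal confident (0.80) -> its class, 1
# sample 1: personal unsure    (0.55) -> base's class, 0
# sample 2: personal confident (0.90) -> its class, 0
```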
Key Findings

Breakthrough Research Results

NeuroFeel represents a significant advancement in emotion recognition technologies

Personalization Success
Superior personalized emotion recognition

Our adaptive personalization framework achieved 88.16% accuracy with a +1.80% improvement over base models, demonstrating the effectiveness of our confidence-based selection approach.

88.16%
Adaptive Accuracy
+1.80%
Over Base Model
Domain Gap Reduction
Bridging laboratory and real-world environments

Our ensemble adaptation approach successfully reduced the domain gap by 89% between laboratory and real-world datasets, with significant performance improvements in cross-dataset emotion recognition.

89%
Gap Reduction
80.27%
Valence Accuracy
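It helps to make concrete how a "gap reduction" percentage can be computed. The sketch below uses a simple proxy for the gap (distance between domain means plus Frobenius distance between domain covariances) and a crude per-feature standardization as the adaptation step; NeuroFeel's actual metric and ensemble adaptation are more sophisticated:

```python
import numpy as np

def domain_gap(source, target):
    """A simple gap proxy: distance between domain means plus
    Frobenius distance between domain covariance matrices."""
    mean_gap = np.linalg.norm(source.mean(axis=0) - target.mean(axis=0))
    cov_gap = np.linalg.norm(np.cov(source, rowvar=False)
                             - np.cov(target, rowvar=False))
    return mean_gap + cov_gap

def gap_reduction(before, after):
    """Percentage of the original gap removed by an adaptation step."""
    return 100.0 * (1.0 - after / before)

rng = np.random.default_rng(2)
src = rng.normal(0.0, 2.0, size=(300, 4))   # stand-in for lab features
tgt = rng.normal(1.0, 0.5, size=(300, 4))   # stand-in for in-the-wild features

# Crude adaptation: match each feature's mean and std to the target
adapted = (src - src.mean(axis=0)) / src.std(axis=0) * tgt.std(axis=0) + tgt.mean(axis=0)
reduction = gap_reduction(domain_gap(src, tgt), domain_gap(adapted, tgt))
```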
Transferable Features
Discovering universal emotion markers

We identified key physiological features that transfer well between datasets, with ECG/HR features showing consistently high importance for cross-dataset emotion recognition.

ECG/HR
Top Signal Type
9 Features
Common Features
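Finding features that transfer is, at heart, an intersection of per-dataset importance rankings. A toy sketch using absolute label correlation as the importance proxy (the feature names and data are invented for illustration):

```python
import numpy as np

def top_features(X, y, names, k=2):
    """Rank features by absolute correlation with the label
    (a simple importance proxy) and return the top-k names."""
    scores = [abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(X.shape[1])]
    return {names[i] for i in np.argsort(scores)[::-1][:k]}

names = ["hr_mean", "hr_std", "eda_mean", "eda_peaks", "resp_rate"]  # invented names
rng = np.random.default_rng(3)

def synth(shift):
    """Toy dataset in which the label depends on the first two features."""
    X = rng.normal(shift, 1.0, size=(200, 5))
    y = (0.9 * X[:, 0] - 0.7 * X[:, 1] + 0.1 * rng.normal(size=200) > 0.2 * shift)
    return X, y.astype(int)

X_lab, y_lab = synth(0.0)     # "laboratory" dataset
X_wild, y_wild = synth(0.5)   # "in-the-wild" dataset

# Features ranked highly in BOTH domains are the transfer candidates
common = top_features(X_lab, y_lab, names) & top_features(X_wild, y_wild, names)
```

Features that score highly in both domains (here the heart-rate stand-ins) are the candidates for cross-dataset transfer.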

Applications & Future Directions

Practical Applications

  • Wearable emotion sensing devices with consistent performance
  • Mental health applications with reduced calibration requirements
  • Affective computing systems with improved generalizability
  • Human-computer interaction with consistent emotion recognition

Future Research Directions

  • Expanding to additional physiological datasets and sensor modalities
  • Integrating visual and audio emotion cues with physiological signals
  • Enhancing domain adaptation with self-supervised approaches
  • Developing real-time adaptation for continuous emotion monitoring
Interactive Demo

Experience NeuroFeel in Action

Explore our interactive demonstration to see domain adaptation and personalization in real-time

Live Framework Demo

Our interactive demo allows you to explore both frameworks side by side, comparing performance and visualizing the domain adaptation process in real-time.

Test the WESAD personalization framework on different subjects
Visualize cross-dataset transfer learning in both directions
Compare model performances across different approaches
Interactive demo of NeuroFeel frameworks
Research Data

Foundational Datasets

NeuroFeel builds upon these key emotion recognition datasets

WESAD Dataset
Wearable Stress and Affect Detection
Environment
Laboratory
Subjects
15
Emotion Model
Discrete
Signal Quality
High (700 Hz)

WESAD is a multimodal dataset for wearable stress and affect detection featuring physiological and motion data recorded from both wrist- and chest-worn devices in a controlled laboratory environment.

ECG · EDA · EMG · RESP · Multi-modal
Schmidt, P., Reiss, A., Duerichen, R., Marberger, C., & Van Laerhoven, K. (2018). Introducing WESAD, a multimodal dataset for Wearable Stress and Affect Detection. ICMI 2018, Boulder, USA.
K-EmoCon Dataset
Continuous Emotion Recognition in Naturalistic Settings
Environment
In-the-wild
Sessions
16
Emotion Model
Dimensional
Signal Quality
Variable (4 Hz)

K-EmoCon is a multimodal dataset with comprehensive annotations of continuous emotions during naturalistic conversations, featuring audiovisual recordings, EEG, and peripheral physiological signals.

HR · EDA · EEG · Social Interaction · Multi-perspective
Park, C. Y., Cha, N., Kang, S., Kim, A., Khandoker, A. H., Hadjileontiadis, L., Oh, A., Jeong, Y., & Lee, U. (2020). K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations. Scientific Data, 7(1), 293.
NeuroFeel Project

Ready to Experience the Future of Emotion Recognition?

Explore our interactive demos and see how NeuroFeel bridges the gap between laboratory precision and real-world applications in emotion recognition technology