Emotional Intelligence is Listen’s multi-modal analysis layer that detects emotion across voice, tone, language, and video, revealing how facial expressions and word choice together can tell a different story than text alone. Built to industry standards, Emotional Intelligence incorporates Ekman’s six universal emotions framework and is benchmarked against MELD for state-of-the-art accuracy.

How Emotional Intelligence Works

Emotional Intelligence is a multi-modal analysis system that processes three signal types simultaneously:
  • Vocal pitch and tone — Changes in voice that signal excitement, hesitation, discomfort, or certainty.
  • Facial expressions — Visual signals captured from video responses.
  • Word choice — Semantic signals in the language participants use.
These signals combine into emotion scores grounded in Ekman’s six core emotions (Anger, Disgust, Fear, Happiness, Sadness, and Surprise). The model is validated against academic benchmarks (CMU-MOSEI and MELD frameworks).
Emotional Intelligence analyzes behavioral signals — vocal patterns, word choice, and observable expressions. It does not collect or analyze biometric data.
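Conceptually, combining per-modality signals into a single set of emotion scores is a late-fusion step. The sketch below illustrates the idea with a simple weighted average; the function names, weights, and fusion rule are illustrative assumptions, not Listen’s actual implementation.

```python
# Hypothetical late-fusion sketch: each modality (voice, face, words)
# emits a distribution over Ekman's six emotions, and a weighted
# average combines them into one score per emotion.

EKMAN_EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_emotion_scores(modality_scores, weights):
    """Combine per-modality emotion distributions with a weighted average.

    modality_scores: dict mapping modality name -> {emotion: probability}
    weights: dict mapping modality name -> relative weight
    """
    total_weight = sum(weights.values())
    fused = {}
    for emotion in EKMAN_EMOTIONS:
        fused[emotion] = sum(
            weights[m] * scores.get(emotion, 0.0)
            for m, scores in modality_scores.items()
        ) / total_weight
    return fused

# Example: vocal and facial signals agree on happiness; words are more neutral.
scores = fuse_emotion_scores(
    {
        "voice": {"happiness": 0.7, "surprise": 0.3},
        "face": {"happiness": 0.8, "surprise": 0.2},
        "words": {"happiness": 0.4, "sadness": 0.1},
    },
    weights={"voice": 1.0, "face": 1.0, "words": 1.0},
)
print(max(scores, key=scores.get))  # prints "happiness"
```

In practice, multi-modal systems often learn the fusion weights rather than fixing them, which lets one modality dominate when another is unreliable (for example, audio-only responses with no video).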

Understanding the Emotional Views

Listen gives you several ways to explore emotional data across your study:
  • Emotional Timeline — Shows how a participant’s emotional state changed throughout an interview. Peaks and valleys correspond to moments of strong positive or negative reaction; use these to pinpoint the exact moments that matter most.
  • Emotion Mapping — Tracks emotional states over the course of an interview, identifying moments of positive emotion, negative emotion, hesitation, surprise, and more, and maps them to specific interview questions or moments.
  • Aggregate Emotion View — Gives you a study-level look at the emotional distribution across all participants for a given question or topic, letting you see patterns across your full sample.
  • Emotion-filtered transcript clips — Let you jump directly to the moments where a specific emotion appeared, so you can watch and hear the context for yourself.
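The peaks and valleys an Emotional Timeline surfaces can be thought of as local extrema in a per-second series of emotion scores. The sketch below shows one simple way to find them; the function, threshold, and time resolution are illustrative assumptions, not Listen’s actual logic.

```python
def find_emotional_peaks(timeline, threshold=0.6):
    """Return indices where an emotion score forms a local peak above threshold.

    timeline: list of floats, one score per time step (e.g. per second).
    """
    peaks = []
    for i in range(1, len(timeline) - 1):
        if (timeline[i] > threshold
                and timeline[i] >= timeline[i - 1]
                and timeline[i] >= timeline[i + 1]):
            peaks.append(i)
    return peaks

# A brief spike of strong emotion around t=3:
print(find_emotional_peaks([0.1, 0.2, 0.5, 0.9, 0.4, 0.2]))  # prints [3]
```

The same idea, inverted, finds valleys; jumping from a detected index back into the transcript is what lets you review the surrounding context directly.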

What You Can Learn

Creative and Ad Testing

Tapping into emotional nuance is especially valuable for evaluating creative work. You can identify which executions generated genuine excitement versus polite interest, pinpoint where in a video ad participants disengaged or showed confusion, and see which specific elements drove the strongest emotional response.

Concept Testing

When testing concepts, emotional data reveals which idea generated the most authentic positive reaction. It also helps you distinguish whether a negative reaction was confusion (which is solvable) or genuine dislike (which is more serious). You can run side-by-side emotional comparisons across concepts and markets.

Usability Testing

For usability work, Emotional Intelligence surfaces the exact moments where participants experienced friction or frustration. It also reveals where delight appeared in the user journey — moments you should reinforce — and exposes the gap between verbal satisfaction ratings and actual emotional signals.

Tips for Working With Emotional Data

  • Focus on unexpected emotional moments. Sudden shifts in emotion — from positive to negative or vice versa — often signal the most important insights. Investigate what triggered the shift.
  • Cross-reference with quotes. Emotional signals are hypotheses, not conclusions. Always read the transcript around an emotional moment to understand its context before drawing conclusions.
  • Use emotional data for concept testing. Emotional reactions to stimuli (ads, product concepts, packaging) are especially valuable because participants often can’t articulate why something resonates or doesn’t.
  • Compare emotional responses across segments. Filter by demographic group or screener answer to see if different groups react differently to the same questions.
  • Focus on sustained signals. Strong emotional indicators held for 3–5 seconds are more meaningful than brief micro-expressions.
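The sustained-signal rule in the last tip can be sketched as a run-length filter over per-frame emotion labels: keep only runs held longer than a minimum duration, discarding brief micro-expressions. Everything here (function name, frame rate, label format) is an illustrative assumption, not Listen’s actual implementation.

```python
def sustained_signals(frames, min_duration=3.0, frame_rate=10):
    """Keep only emotion runs held for at least min_duration seconds.

    frames: list of emotion labels, one per frame (None = no strong signal).
    frame_rate: frames per second.
    Returns (emotion, start_index, length_in_frames) tuples.
    """
    min_frames = int(min_duration * frame_rate)
    runs = []
    start = 0
    for i in range(1, len(frames) + 1):
        # A run ends at the end of the list or when the label changes.
        if i == len(frames) or frames[i] != frames[start]:
            if frames[start] is not None and i - start >= min_frames:
                runs.append((frames[start], start, i - start))
            start = i
    return runs

# 3.5 s of happiness is kept; a 1 s flash of surprise is filtered out.
frames = ["happiness"] * 35 + [None] * 5 + ["surprise"] * 10
print(sustained_signals(frames))  # prints [('happiness', 0, 35)]
```

Lowering `min_duration` trades noise for sensitivity: micro-expressions start to surface, but so do spurious single-frame classifications.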