How Can Driver Emotions Be Quantified?

Apr 12, 2016 · 4 min read

How can emotions be quantified?

To describe facial expressions systematically, Paul Ekman and his colleagues developed the Facial Action Coding System (FACS) [1]. This system groups visible facial muscle movements and certain head motions into identifiable units called Action Units (AUs). Each AU represents a specific facial muscle activity that can be observed and coded manually or by software.

The table below shows a selection of AUs along with their descriptions and how accurately they were detected by facial recognition software I used in 2016.

| Action Unit | Description | Accuracy | N |
|---|---|---|---|
| 1 | Raise inner eyebrow | 89.7 % | 175 |
| 2 | Raise outer eyebrow | 88.9 % | 117 |
| 4 | Lower brows | 94.3 % | 194 |
| 5 | Raise upper eyelids | 95.1 % | 102 |
| 6 | Raise cheeks | 92.7 % | 123 |
| 7 | Tighten eyelids | 93.4 % | 121 |
| 9 | Wrinkle nose | 100 % | 75 |
| 10 | Raise upper lip | 90.5 % | 21 |
| 12 | Pull lip corners (smile) | 95.4 % | 131 |
| 14 | Dimpler | 67.6 % | 37 |
| 15 | Depress lip corners | 89.4 % | 94 |
| 17 | Raise chin | 86.6 % | 202 |
| 18 | Pucker lips | 88.9 % | 9 |
| 20 | Stretch lips | 92.4 % | 79 |
| 23 | Tighten lips | 63.3 % | 60 |
| 24 | Press lips together | 65.5 % | 58 |
| 25 | Open mouth | 76.9 % | 324 |
| 26 | Drop jaw | 48 % | 50 |
| 28 | Suck in lips | 100 % | 1 |
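In code, a FACS-coded frame can be represented simply as the set of active AU numbers. A minimal sketch (the AU descriptions come from the table above; the `describe_frame` helper and the example frame are made up for illustration):

```python
# AU descriptions taken from the table above (a subset of the full FACS inventory).
AU_DESCRIPTIONS = {
    1: "Raise inner eyebrow",
    2: "Raise outer eyebrow",
    4: "Lower brows",
    9: "Wrinkle nose",
    12: "Pull lip corners (smile)",
    18: "Pucker lips",
    24: "Press lips together",
}

def describe_frame(active_aus):
    """Translate a coded frame (a set of active AU numbers) into
    human-readable muscle movements."""
    return [f"AU{au}: {AU_DESCRIPTIONS[au]}"
            for au in sorted(active_aus) if au in AU_DESCRIPTIONS]

# A hypothetical frame in which the driver wrinkles the nose and puckers the lips:
print(describe_frame({9, 18}))  # ['AU9: Wrinkle nose', 'AU18: Pucker lips']
```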

Although there are some individual differences, it is generally assumed that specific facial expressions are universally associated with certain emotions, across people and cultures. The diagram below illustrates how specific combinations of Action Units (AUs) form the prototypical expressions of basic emotions like fear, joy, sadness, and anger [2].

How can we read a driver’s face?

The Facial Action Coding System (FACS) gives us a systematic way to describe facial muscle movements, which means we can leverage it to assess driver emotions. But here's the question:

Which facial muscle movements are linked to frustration during driving?

Thirty participants took part in a simulated driving experiment. Each driver faced two contrasting traffic scenarios:

🚗 Smooth traffic
🚗 Traffic jam

To enhance immersion, participants were rewarded for reaching their destination within a set time limit, which made delays feel more frustrating and emotionally charged. The image below shows the driving simulator setup and one such frustrating moment, captured inside our virtual traffic jam:

(Image: the driving simulator setup during the virtual traffic jam)

Statistical results revealed that several facial muscle movements (Action Units, AUs) appeared significantly more often during traffic jams than in smooth traffic. These frustration-linked AUs include:

  • AU2: Raise outer eyebrows
  • AU5: Raise upper eyelids
  • AU6: Raise cheeks
  • AU9: Wrinkle nose
  • AU10: Raise upper lip
  • AU12: Pull lip corners (smile)
  • AU14: Dimpler
  • AU15: Depress lip corners
  • AU17: Raise chin
  • AU18: Pucker lips
  • AU23: Tighten lips
  • AU28: Suck in lips
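The article does not name the statistical test behind these results. One common way to compare how often an AU occurs in two conditions is a chi-square test on a 2×2 contingency table of active/inactive counts. A sketch with made-up counts (not the study's data):

```python
def chi2_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Illustrative counts (NOT the study's data): frames in which a given AU
# was active vs. inactive in each condition.
jam_active, jam_inactive = 60, 140        # traffic jam
smooth_active, smooth_inactive = 15, 185  # smooth traffic

stat = chi2_2x2(jam_active, jam_inactive, smooth_active, smooth_inactive)
CRITICAL_05 = 3.841  # chi-square critical value for df = 1, alpha = .05
print(f"chi2 = {stat:.2f}, significant: {stat > CRITICAL_05}")
```

With these toy counts the statistic is about 33.2, well above the 3.841 threshold, so the AU would be flagged as occurring significantly more often in the jam condition.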

What is a frustrated facial expression?

While individual Action Units (AUs) are meaningful, facial expressions are usually combinations of multiple AUs. The AUs that appeared more frequently during traffic jams probably do not act alone; they likely combine into more complex expressions. To uncover these hidden combinations, I ran a K-Means clustering analysis on the facial data observed during the jam condition. The core idea is simple: group similar data points based on how close they are in feature space.

(Radar chart: average AU activation levels, i.e. the cluster centers, for the five K-Means clusters)

The K-Means process works in the following steps:

  1. Randomly choose initial cluster centers.
  2. Assign each observation to the nearest cluster center.
  3. Update the cluster centers based on the new groupings.
  4. Repeat steps 2–3 until the cluster centers stabilize.
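The four steps above can be sketched in a few lines of NumPy (a minimal illustration, not the exact analysis pipeline used in the study; real implementations such as scikit-learn's `KMeans` also handle empty clusters and multiple restarts):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0, init=None):
    """Minimal K-Means following the four steps above."""
    rng = np.random.default_rng(seed)
    # 1. Choose initial cluster centers (randomly from the data by default).
    if init is None:
        centers = X[rng.choice(len(X), size=k, replace=False)]
    else:
        centers = np.asarray(init, dtype=float)
    for _ in range(n_iter):
        # 2. Assign each observation to the nearest cluster center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # 3. Update each center to the mean of its assigned points.
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        # 4. Stop once the centers no longer move.
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

In the study, each observation would be a vector of AU activation levels for one frame; the returned cluster centers are what the radar chart visualizes.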

Facial expression data were grouped into five clusters using K-Means clustering. Each cluster represents a common combination of Action Units (AUs), potentially reflecting a distinct emotional pattern. The radar chart above shows the average activation level (cluster centers) for each AU across the five clusters.

Cluster 4 was the only one that appeared significantly more often during traffic jams.

Cluster 4 is mainly defined by the co-activation of:

  • AU9: Wrinkle nose
  • AU18: Pucker lips
  • AU24: Press lips together

This particular combination may represent the facial expression of frustration in the driving context.
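If per-frame AU activations are available, this cluster-4 pattern suggests a simple rule-of-thumb detector. The sketch below is purely illustrative (the function name and the binary-activation assumption are mine; a real system would work on continuous activation levels, likely with a trained classifier):

```python
# The three co-activated AUs that define cluster 4 above.
FRUSTRATION_AUS = {9, 18, 24}  # wrinkle nose, pucker lips, press lips together

def looks_frustrated(active_aus):
    """Flag a frame as potentially frustrated when all three
    cluster-4 AUs are active at the same time."""
    return FRUSTRATION_AUS <= set(active_aus)

print(looks_frustrated({9, 18, 24, 12}))  # True
print(looks_frustrated({9, 18}))          # False
```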

What does a “frustrated face” look like?

AU9 + AU18 + AU24 = The Face We’re Looking For


  1. Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Manual for the Facial Action Coding System. Salt Lake City: A Human Face. ↩︎

  2. Ekman, P., & Friesen, W. V. (1978). Manual for the Facial Action Coding System. Palo Alto: Consulting Psychologists Press. ↩︎