Newsletter | Feb 21, 2025
Gaming could benefit from measuring human emotion
Affective computing is a field of study focused on allowing technology to better understand human emotion. The MIT Media Lab has a department dedicated to this study, part of which is focused on “enabling robots and computers to respond intelligently to natural human emotional feedback”. This concept may sound like science fiction, but recently we have seen a number of companies attempting to interpret brain data, facial expressions, and other biosignals to infer a person's emotions.
This week we want to explain why measuring emotion is difficult, what doing it successfully might look like, and where consumer adoption will likely emerge.
Measuring emotion has been a topic of interest in psychology since as far back as the 19th century, with Charles Darwin’s 1872 “The Expression of the Emotions in Man and Animals” being one of the prominent writings of that era of thinking. Over the years, other psychologists have debated what causes emotions and have largely settled into two primary styles of thinking (Gendron & Barrett). More recently, in 2017, a third, controversial theory emerged: Lisa Feldman Barrett’s theory of constructed emotion.
While the distinction may seem trivial, the difference between universal innate responses and ones driven by a person's context changes how we think about measuring emotions. For example, if emotions are directly tied to their physical expressions (as basic emotion theory suggests), then we can measure happiness directly by smiles. Similarly, a quantitative measure (brainwaves, facial muscle contortion, heart rate, and so on) could determine when someone exhibits anger. However, if emotions are instead constructed based on context, we can be far less sure what each measurement actually means.
To put this into perspective, Lisa Feldman Barrett gives the following example in her book How Emotions Are Made:
“Take a look at the woman [below], who is screaming in terror. Most people who were born and raised in western culture can effortlessly see this emotion on her face…”
Except she is not feeling terror. If you flip to the end of the newsletter, you can see that this is actually Serena Williams after winning the 2008 US Open. Barrett argues that context matters so much that it explains the mixed results we have seen when trying to measure emotion by purely quantitative means (e.g., brainwaves, facial recognition, or biometrics).
Compressing hundreds of years of debate about emotion into this simple example does not do it justice, and to be clear, this perspective is still hotly debated among researchers. However, for the sake of this article and thinking about how emotion can be measured and analyzed, we will operate under the assumption that Construction Theory is correct: emotions and the way they are expressed are NOT universal across cultures, individual people, or even different experiences. Instead, emotions are our subconscious interpretation of our current situation (and physical state) in the broader context of our life experience.
The concept of measuring emotion is not new. Solsten, a Konvoy portfolio company, provides insights into player emotions as part of its comprehensive approach to understanding players through psychometric assessments. More recently, we have seen approaches focused on real-time, physiological measurement. The methodology that many startups use to “measure emotion” is the Circumplex Model of Affect (see below), which, as the name suggests, measures affect.
The American Psychological Association defines affect as “any experience of feeling or emotion, ranging from suffering to elation, from the simplest to the most complex sensations of feeling, and from the most normal to the most pathological emotional reactions.” This two-dimensional chart (seen below) plots how pleasant or unpleasant a sensation is on the horizontal axis and its degree of activation on the vertical axis. Note: the Circumplex Model does not strictly measure emotion but instead encompasses all types of feelings and sensations a person experiences.
While this does not directly correspond to emotion (because it lacks context), many of the startups we have seen in the space argue that mapping against the Circumplex Model can get you directionally close to identifying the actual emotion being experienced. Then, in the spirit of Construction Theory, by layering in additional context, you can be accurate enough for a consumer product. For example, suppose you are a company observing a person playing a video game. If you can identify an unpleasant feeling via facial expression and very high activation via sweat gland secretion, and the game reports that the player just died on their seventh attempt at defeating a Dark Souls boss, you can reasonably conclude that the player is angry.
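To make that concrete, below is a minimal sketch of how such a system might combine a circumplex reading with game context. Every signal name, threshold, and label here is a hypothetical illustration of the approach, not any startup's actual implementation:

```python
# A minimal sketch of the context-layering idea. All signal names, thresholds,
# and labels are hypothetical illustrations, not any vendor's real API.
from dataclasses import dataclass

@dataclass
class AffectSample:
    valence: float  # -1.0 (unpleasant) to 1.0 (pleasant), e.g. from facial analysis
    arousal: float  # 0.0 (calm) to 1.0 (highly activated), e.g. from skin conductance

@dataclass
class GameContext:
    event: str     # e.g. "boss_attempt_failed"
    attempts: int  # how many times the player has retried

def infer_emotion(sample: AffectSample, ctx: GameContext) -> str:
    """Map a point on the circumplex (valence x arousal) to an emotion label,
    using game context to disambiguate, since the same physiology can mean
    different things in different situations (per Construction Theory)."""
    if sample.arousal > 0.7 and sample.valence < -0.3:
        # High activation + unpleasant: anger or fear? Context decides.
        if ctx.event == "boss_attempt_failed" and ctx.attempts >= 5:
            return "frustration/anger"
        return "fear/stress"
    if sample.arousal > 0.7 and sample.valence > 0.3:
        # High activation + pleasant: the Serena-after-match quadrant.
        return "excitement/triumph"
    if sample.arousal < 0.3:
        return "boredom" if sample.valence < 0 else "calm/contentment"
    return "ambiguous"

# The Dark Souls example from above: unpleasant, highly activated, seventh attempt.
print(infer_emotion(AffectSample(valence=-0.6, arousal=0.9),
                    GameContext(event="boss_attempt_failed", attempts=7)))
# -> frustration/anger
```

The point of the sketch is the disambiguation step: the same high-arousal, negative-valence reading maps to different labels depending on what the game says just happened.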
The more context you can layer into the analysis, the more accurate your interpretation of a person's emotion is likely to be. In the example above, if we had known that Serena was exhibiting certain facial expressions in the context of a tennis match, perhaps with a certain degree of pupil dilation, and all of this mapped to previous examples of “excitement”, we could potentially ascertain her emotional state numerically. This is similar to how a human identifies another person's emotions from the context of the situation, an ability that likely improves the more you know the person in question.
Note: There are many ongoing debates as to whether emotion can ever be measured at all, and if so, how accurately. So, while we make some assumptions, this “science” cannot be taken as settled fact and will continue to evolve in the same way that it has for hundreds of years. Some debates revolve around the ability to accurately measure activation or positive/negative feelings, some around the difficulty of validating a measurement without biasing the outcome, and some even question the usefulness of these insights even if we get the rest right.
In addition to finding use cases that provide additional context, we also wanted to home in on use cases that make this data actionable. Unlike other health data such as Whoop’s “Recovery Score”, which recommends whether a user should work out and how hard, emotion data is not inherently actionable. Just knowing that you are angry does not immediately help you change the way you live. Below, we highlight two areas where this technology could emerge that are both highly actionable and rich in interpretive context: video games and health applications.
Video games are a particularly interesting place to capture context because you can link physical biomarkers directly to in-game events, and the player sits in a relatively stable environment, which reduces extraneous signals that could disrupt measurement. However, from an “actionability” perspective, it is hard to pinpoint the killer application that would drive player adoption of an affective computing product in gaming.
While content or non-player characters (NPCs) that react to a player’s emotions to increase immersion could be compelling, we have yet to see a mainstream implementation of real-time reactive environments; layering in emotion would be an extra technical hurdle. Not to mention, it is unclear whether this would be adopted by a large audience, remain a novelty in a few genres, or, at worst, be perceived as gimmicky. However, there are signs of promise.
Elliott Hedman, a former research assistant at the MIT Media Lab and founder of mPath, a consulting company that helps companies like the LEGO Group and McGraw Hill empathize with and build for their users, said, “By combining sensor and contextual data, we’ve been able to see hidden layers in the gaming experience. We can see exact moments when players become bored, feel overwhelmed, and reengage.”
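We do not know mPath's exact methodology, but the underlying pattern Hedman describes is straightforward to illustrate: record an arousal signal during play, flag sustained low-arousal spans, and line them up with what the game was doing at those moments. The sketch below uses invented data and thresholds:

```python
# An illustrative sketch (not mPath's actual method): find "boredom" windows
# where an arousal signal stays low for several consecutive samples, so they
# can later be matched against in-game events at those timestamps.
def find_boredom_windows(arousal, timestamps, threshold=0.25, min_len=3):
    """Return (start, end) timestamp pairs where arousal stays below
    `threshold` for at least `min_len` consecutive samples."""
    windows = []
    run = []  # timestamps of the current low-arousal streak
    for t, a in zip(timestamps, arousal):
        if a < threshold:
            run.append(t)
        else:
            if len(run) >= min_len:
                windows.append((run[0], run[-1]))
            run = []
    if len(run) >= min_len:  # close out a streak that runs to the end
        windows.append((run[0], run[-1]))
    return windows

# Hypothetical per-second arousal readings during a level:
arousal = [0.6, 0.5, 0.2, 0.1, 0.15, 0.2, 0.7, 0.8, 0.3]
seconds = list(range(len(arousal)))
print(find_boredom_windows(arousal, seconds))  # -> [(2, 5)]
```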
Adding emotion data to a health application like Whoop would be beneficial because these applications already have a user's health data for context. When users learn that they are angry, they can correlate that signal with poor sleep or fitness patterns and derive actionable insights for change. In the moment, they can even be encouraged to take intervening actions to adjust their emotional state (e.g., taking a walk or meditating).
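As a hedged illustration of that correlation step, with invented data and field names, a health application might run something like the following:

```python
# A sketch of correlating inferred emotion with sleep. The data is invented;
# real applications would use far more samples and confounder controls.
from statistics import correlation  # Python 3.10+, Pearson's r

# Hypothetical week of data: hours slept the night before, and the number of
# high-arousal / negative-valence episodes inferred during the following day.
sleep_hours    = [7.5, 6.0, 8.0, 5.5, 7.0, 4.5, 8.5]
anger_episodes = [1,   4,   0,   5,   2,   6,   1]

r = correlation(sleep_hours, anger_episodes)
print(f"Pearson r between sleep and anger episodes: {r:.2f}")
if r < -0.5:
    # The actionable part: turn the pattern into a concrete suggestion.
    print("Poor sleep tracks with more anger episodes; consider an earlier bedtime.")
```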
At the very least, users could be warned not to engage in activities where their emotional state could have strong negative impacts (e.g., writing that regrettable inflammatory email). This could radically change the relationship between how people interpret and understand their own emotional wellbeing.
We are optimistic about the future of measuring emotion and its integration into our daily lives. However, we believe that health incumbents are best positioned to bring this technology to consumers given their hardware distribution advantage, existing data pools that would make emotional data more actionable, and their established user base of health-minded individuals. That said, we also believe there could be breakout potential for a startup that comes up with a killer use case that drives consumers to their hardware, and we would be excited to speak to the founders ready to take on that fight.
Takeaway: Affective computing is making strides in measuring human emotion. While video games could prove to be a great incubator for this technology, it would need a killer use case, game, or technological innovation in order to overcome the distribution advantages and data troves of incumbents. Today, the best hope for widespread consumer adoption lies in integrating this data with other health metrics to drive actionable insights and change the way people understand and interpret their own emotional health.