Do you feel me?
Just as once-novel voice recognition technology is now a ubiquitous part of human–machine relationships, so too could mood recognition technology (aka “affective computing”) soon pervade digital interactions.
Through the application of machine learning, Big Data inputs, image recognition, sensors, and in some cases robotics, artificially intelligent systems hunt for affective clues: widened eyes, quickened speech, and crossed arms, as well as changes in heart rate or skin conductance.
Emotions are big business
The global affective computing market is estimated to grow from just over US$9.3 billion a year in 2015 to more than $42.5 billion by 2020.
Source: “Affective Computing Market 2015 – Technology, Software, Hardware, Vertical, & Regional Forecasts to 2020 for the $42 Billion Industry” (Research and Markets, 2015)
Customer experience is the sweet spot
Forrester found that emotion was the number-one factor in determining customer loyalty in 17 out of the 18 industries it surveyed – far more important than the ease or effectiveness of customers’ interactions with a company.
Humana gets an emotional clue
Source: “Artificial Intelligence Helps Humana Avoid Call Center Meltdowns” (The Wall Street Journal, October 27, 2016)
Insurer Humana uses artificial intelligence software that can detect conversational cues to guide call-center workers through difficult customer calls. The system recognizes that a steady rise in the pitch of a customer’s voice or instances of agent and customer talking over one another are causes for concern.
The system has led to hard results: Humana says it has seen a 28% improvement in customer satisfaction, a 63% improvement in agent engagement, and a 6% improvement in first-contact resolution.
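Humana's actual system is proprietary, but the two cues described above — a steady rise in the customer's vocal pitch and frequent talk-over — can be sketched as a simple heuristic. Everything here (the `Window` fields, thresholds, and function name) is an illustrative assumption, not Humana's implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Window:
    """One short slice of a call (illustrative fields, not Humana's schema)."""
    customer_pitch_hz: float  # estimated pitch of the customer's speech
    overlap: bool             # agent and customer speaking at the same time

def call_needs_attention(windows: List[Window],
                         pitch_rise_hz: float = 30.0,
                         max_overlaps: int = 3) -> bool:
    """Flag a call when customer pitch climbs steadily or talk-over is frequent.

    "Steady rise" is approximated as each window's pitch being at least the
    previous one's, with a total climb above pitch_rise_hz. The thresholds
    are invented for this sketch.
    """
    if len(windows) < 2:
        return False
    pitches = [w.customer_pitch_hz for w in windows]
    steady_rise = (all(b >= a for a, b in zip(pitches, pitches[1:]))
                   and pitches[-1] - pitches[0] >= pitch_rise_hz)
    frequent_overlap = sum(w.overlap for w in windows) >= max_overlaps
    return steady_rise or frequent_overlap
```

A flagged call could then prompt the software to coach the agent in real time, which is the use the article describes.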
Spread happiness across the organization
Source: “Happiness and Productivity” (University of Warwick, February 10, 2014)
Employers could monitor employee moods to make organizational adjustments that increase productivity, effectiveness, and satisfaction. Happy employees are around 12% more productive.
Walking on emotional eggshells
Whether customers and employees will be comfortable having their emotions logged and broadcast by companies is an open question. Customers may find some uses of affective computing creepy or, worse, predatory. Companies should secure explicit permission before collecting or analyzing emotional data.
Other limiting factors
The availability of the data required to infer a person’s emotional state is still limited. Further, it can be difficult to capture all the physical cues that may be relevant to an interaction, such as facial expression, tone of voice, or posture.
Get a head start
Discover the data
Companies should determine what inferences about mental states they want the system to make and how accurately those inferences can be made using the inputs available.
Work with IT
Involve IT and engineering groups to figure out the challenges of integrating with existing systems for collecting, assimilating, and analyzing large volumes of emotional data.
Consider the complexity
Some emotions may be more difficult to discern or respond to. Context is also key. An emotionally aware machine would need to respond differently to frustration in a user in an educational setting than to frustration in a user in a vehicle.
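The context point above can be made concrete with a minimal dispatch table: the same detected emotion maps to different actions depending on where it occurs. The settings and responses below are invented for illustration, not drawn from any real system.

```python
# Sketch: one detected emotion, different responses per context.
# All (emotion, context) pairs and actions are illustrative assumptions.
RESPONSES = {
    ("frustration", "education"): "offer a hint and slow the lesson pace",
    ("frustration", "vehicle"): "simplify the display and defer notifications",
    ("confusion", "education"): "replay the last step with more detail",
}

def respond(emotion: str, context: str) -> str:
    # Fall back to a neutral action when the pair is unrecognized,
    # rather than guessing at an intervention.
    return RESPONSES.get((emotion, context), "take no special action")
```

The design choice worth noting is the fallback: an emotionally aware system that misreads context is worse than one that does nothing, so unknown pairs default to no intervention.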
To learn more about how affective computing can help your organization, read the feature story "Empathy: The Killer App for Artificial Intelligence."