Affective Analytics and the Human Emotional Experience
Whether you are happy, grumpy, excited, or sad, a computer wants to know, and affective analytics can tell it.
Emotion is a universal language. No matter where you go in the world, a smile or a tear is understood. Even our beloved pets recognize and react to human emotion. What if technology could detect and respond to your emotions as you interact and engage in real time? Numerous groups are evaluating affective computing capabilities today. From marketing and healthcare to customer success or human resource management, the use cases for affective analytics are boundless.
What is affective analytics?
Affective computing humanizes digital interactions by building artificial emotional intelligence. It is the ability of computers to detect, recognize, interpret, process, and simulate human emotions from visual, textual, and auditory sources. By combining facial expression data with physiological, brain activity, eye tracking, and other signals, these systems evaluate and measure human emotion in context. Deep learning algorithms interpret a person's emotional state and can adjust responses according to the perceived feelings.
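To make that concrete, here is a minimal, purely illustrative sketch of the idea: per-modality emotion scores (face, voice, text) are fused into one profile, and a reply is adapted to the dominant perceived emotion. The modality names, weights, labels, and canned responses are hypothetical, not any particular vendor's model.

```python
# Illustrative sketch only: a toy emotion-aware assistant that fuses
# per-modality emotion scores and adapts its reply. The modality names,
# weights, and labels are hypothetical, not any vendor's actual model.

from typing import Dict

# Hypothetical fusion weights for each input modality.
MODALITY_WEIGHTS = {"face": 0.5, "voice": 0.3, "text": 0.2}

def fuse_emotion_scores(scores: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """Weighted average of per-modality emotion scores (0..1) into one profile."""
    fused: Dict[str, float] = {}
    for modality, emotions in scores.items():
        weight = MODALITY_WEIGHTS.get(modality, 0.0)
        for emotion, score in emotions.items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return fused

def choose_reply(fused: Dict[str, float]) -> str:
    """Adjust the assistant's tone based on the dominant perceived emotion."""
    dominant = max(fused, key=fused.get)
    if dominant == "anger":
        return "I'm sorry about the trouble. Let me fix this right away."
    if dominant == "joy":
        return "Great to hear! Anything else I can help with?"
    return "Thanks for the details. Here's what I found."

# Example: camera, microphone, and chat text each contribute a score.
observation = {
    "face":  {"joy": 0.1, "anger": 0.7, "neutral": 0.2},
    "voice": {"joy": 0.2, "anger": 0.5, "neutral": 0.3},
    "text":  {"joy": 0.0, "anger": 0.8, "neutral": 0.2},
}
print(choose_reply(fuse_emotion_scores(observation)))  # -> the apologetic reply
```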
As natural language interactions with technology continue to evolve, starting with search, bots, and personal assistants like Alexa, emotion detection is already emerging to advance advertising, marketing, entertainment, travel, customer experience, and healthcare. Affective analytics provides much deeper, quantitatively measured understanding of how someone experiences the world. It should improve the quality of digital customer experiences and shape our future relationship with robots. Yes, the robots are coming.
{image 1}
Rosalind Picard, a pioneer of affective computing and a Fellow of the IEEE for her momentous contributions, wrote a white paper on these capabilities more than 20 years ago. Affective analysis can be used to help patients with autism, epilepsy, depression, PTSD, sleep and stress issues, dementia, and autonomic nervous system disorders. She holds multiple patents for wearable and non-contact sensors, algorithms, and systems that can sense, recognize, and respond respectfully to humans. Technology has finally advanced to a point where Picard’s vision can become a reality.
Measuring emotion
Today, numerous vendors and RESTful APIs are available for measuring human emotion, including Affectiva, Humanyze, nViso, Realeyes, Beyond Verbal, Sension, CrowdEmotion, Eyeris, Kairos, Emotient, IBM Watson Tone Analyzer, AlchemyAPI, and Intel RealSense. Sensor, image, and text emotion analysis is already being integrated into business applications, reporting systems, and intelligent things. Let’s delve into how feelings are calculated and predicted.
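Most of these services are exposed as REST endpoints that accept an image, audio clip, or text and return emotion scores. The sketch below shows the general shape of such a call in Python using the requests library; the endpoint URL, authentication header, and response fields are hypothetical placeholders, so consult a specific vendor's documentation for the real request format.

```python
# Minimal sketch of calling a RESTful emotion-analysis API with a face image.
# The endpoint, parameters, and response fields are hypothetical placeholders;
# each vendor (Affectiva, Kairos, Watson, etc.) defines its own request format
# and authentication scheme.

import requests

API_URL = "https://api.example-emotion-vendor.com/v1/analyze"  # hypothetical
API_KEY = "YOUR_API_KEY"                                        # hypothetical

def analyze_image(path: str) -> dict:
    """Send a face image and return the service's emotion scores as JSON."""
    with open(path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
        )
    response.raise_for_status()
    return response.json()  # e.g. {"joy": 0.82, "anger": 0.03, ...}

if __name__ == "__main__":
    scores = analyze_image("customer_frame.jpg")
    print(max(scores, key=scores.get), scores)
```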
For example, Affectiva computes emotions from facial expressions using a metric system called Affdex. Affdex metrics cover emotions, facial expressions, emojis, and appearance. These measures provide insight into a subject’s engagement and emotional experience. Engagement measures analyze facial muscle movement and expressiveness. Observed expressions serve as input to estimate the likelihood of a learned emotion and to predict whether the occurrence is positive or negative. The combination of emotion, expression, and emoji metric scores is analyzed to determine when a subject shows a specific emotion. The output includes the predicted human sentiment along with a degree of confidence.
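As a rough illustration of that roll-up, the sketch below averages hypothetical per-frame emotion and expression scores, requires supporting expression evidence before accepting an emotion, and reports the winner with a confidence value. The field names and thresholds are invented for this example and are not Affectiva's actual Affdex SDK.

```python
# Illustrative sketch of rolling framewise metric scores up into a predicted
# sentiment plus confidence, in the spirit of the Affdex description above.
# Field names and thresholds are hypothetical, not Affectiva's SDK.

from statistics import mean

# Hypothetical per-frame scores (0..100) from a face-tracking pipeline.
frames = [
    {"joy": 85, "anger": 3, "smile": 90, "brow_furrow": 2},
    {"joy": 78, "anger": 5, "smile": 80, "brow_furrow": 4},
    {"joy": 90, "anger": 1, "smile": 95, "brow_furrow": 1},
]

# Which expression supports which emotion (hypothetical mapping).
EVIDENCE = {
    "joy": "smile",
    "anger": "brow_furrow",
}

def predict_sentiment(frames, threshold=50.0):
    """Average emotion scores over frames, require supporting expression
    evidence, and report the winning emotion with a 0..1 confidence."""
    best_emotion, best_conf = "neutral", 0.0
    for emotion, expression in EVIDENCE.items():
        emotion_avg = mean(f[emotion] for f in frames)
        expression_avg = mean(f[expression] for f in frames)
        if emotion_avg >= threshold and expression_avg >= threshold:
            confidence = min(emotion_avg, expression_avg) / 100.0
            if confidence > best_conf:
                best_emotion, best_conf = emotion, confidence
    return best_emotion, best_conf

print(predict_sentiment(frames))  # -> ('joy', 0.84...)
```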
Emotionally-aware computing
Realistically, affective analytics does open up a whole new world of insights and enhanced human–computer interactions. However, this science will be fraught with errors. Humans are not machines. Understanding human emotions is a complex skill that few humans have mastered. Emotions could be lurking long before an encounter or be triggered by unrelated thoughts during an interaction. Many subjects will likely alter behavior if they know emotion is being measured. Do you react differently when you know you are being recorded?
As affective analytics matures, I expect challenges to arise in adapting algorithms for a wide variety of individuals and cultures that show, hide, or express emotions differently. How will that be programmed? Collecting and classifying emotion also raises unique personal privacy and ethical concerns.
A future with emotionally-aware machines in our daily lives will be fascinating. Affective computing may well wake us up, or shake us up. With machines becoming emotionally intelligent, will humans evolve to be less open, genuine, or passionate? Affective computing will undoubtedly change the human-computer experience. It will likely also alter our human experiences.