Some of us have our feelings written all over our faces. Others may pride themselves on being inscrutable. However, when a computer is analyzing our features frame by frame, it can glean insight from even the slightest quirk.
At last week's Sentiment Analysis Symposium in New York, Jacob Whitehill, a research scientist with Emotient, demonstrated the company's emotion recognition products. He showed how they isolate the faces in a video stream and track their expressions, from joyful to angry to sad. "It has many commercial applications," he said.
Emotient provides an API for real-time emotional analysis, Whitehill said, delivering highly accurate readings of positive, negative, and neutral emotion grounded in cognitive science, machine learning, and computer vision. By mining large datasets of facial expressions, it can find patterns and sometimes even predict how people will react to a given stimulus. In addition to wide smiles and angry nostril flares, the software detects "microexpressions" such as flashes of disgust or contempt.
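The pipeline Whitehill describes — isolate each face in the video stream, then score every frame for emotion — can be sketched in a few lines of Python. Everything below is a hypothetical stand-in for illustration: the `FrameReading` structure, the three score channels, and the sample values are assumptions, not Emotient's actual API.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical per-frame result; Emotient's real API may differ.
@dataclass
class FrameReading:
    timestamp: float   # seconds into the video
    positive: float    # each score assumed to lie in [0, 1]
    negative: float
    neutral: float

    def dominant(self) -> str:
        """Return whichever of the three channels scored highest."""
        scores = {"positive": self.positive,
                  "negative": self.negative,
                  "neutral": self.neutral}
        return max(scores, key=scores.get)

def track_expressions(readings: List[FrameReading]) -> List[str]:
    """Reduce a stream of frame scores to per-frame emotion labels."""
    return [r.dominant() for r in readings]

# Made-up scores for a short clip.
stream = [
    FrameReading(0.0, 0.1, 0.1, 0.8),
    FrameReading(0.5, 0.7, 0.1, 0.2),  # a smile flashes by
    FrameReading(1.0, 0.2, 0.6, 0.2),  # a brief grimace
]
print(track_expressions(stream))  # ['neutral', 'positive', 'negative']
```

In a real system the scores would come from a trained computer-vision model rather than hand-typed values; the point here is only the frame-by-frame reduction from raw scores to emotion labels.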
According to Emotient's website, the API measures 28 facial action units, including eyebrow raises, nose wrinkles, lip curls, and jaw drops.
The obvious use case is for focus groups, with the software noting positive and negative reactions far more quickly and comprehensively than human observers. In research for consumer packaged goods, Whitehill said, facial analysis was a more accurate predictor of "proclivity to buy" than self-reporting by the subjects. It wasn't so much that certain package designs evoked huge smiles, Whitehill pointed out. "Lack of negative reaction was a strong predictor."
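The "lack of negative reaction" finding suggests a very simple aggregate: rather than rewarding peaks of joy, penalize any spike of negative emotion. The toy predictor below illustrates the idea; the threshold, the rule, and the sample scores are all made up for this sketch and do not come from Emotient.

```python
def proclivity_to_buy(negative_scores, threshold=0.3):
    """
    Toy predictor: flag a subject as likely to buy only if their
    per-frame negative-emotion score never rises above `threshold`.
    Both the rule and the 0.3 cutoff are illustrative assumptions.
    """
    return max(negative_scores) < threshold

# Hypothetical per-frame negative-emotion scores for two subjects.
calm_viewer = [0.05, 0.10, 0.08, 0.12]  # never reacts badly
put_off     = [0.05, 0.45, 0.20, 0.10]  # one flash of disgust

print(proclivity_to_buy(calm_viewer))  # True
print(proclivity_to_buy(put_off))      # False
```

Note that the calm viewer never shows strong enjoyment either; under this rule, the absence of a negative spike is what counts, which is exactly the pattern Whitehill described.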
Read the rest of this article on All Analytics.