“Apple’s new iPhone X will let you control the poo emoji with your face”. This was all over the UK tabloid headlines this morning. It’s true – the newly announced iPhone X will let you control a range of Animoji with your face. But the emotion-recognition capability that sits behind this is genuinely exciting, and its potential extends far beyond the newly grimacing poo emoji.
In 2016, Apple bought Emotient, a tech company that uses AI to read people’s facial expressions and analyse their emotions. According to Emotient’s own website, one of the technology’s applications is to measure customers’ “unfiltered emotional response to ads, content, products and customer service or sales interactions.”
In the world of insight, understanding emotions is both fundamental and notoriously difficult. Today, exploring the emotions people feel in response to a stimulus (an ad, a concept, a product) relies mainly on self-reporting – ticking a box or circling the face that best represents how you feel – or on biometric measurement in a lab. The iPhone X opens up a world of new possibilities: emotion recognition will leave the lab and become truly mobile. And in five years’ time, it will likely be in the hands of far more than just early adopters. As uptake of this technology grows, quantitative concept testing should be able to include emotion recognition as standard.
And, excitingly for us at FreshMinds, emotion recognition could have applications for passive tracking. Will it be possible, with consent, to identify and track the things that genuinely made someone smile – or cry – bringing us and our clients even closer to the moments that matter?