Apple’s new iPhone X uses face recognition to let users unlock their mobile devices simply by looking at them. However, the technology may have a much broader and more profound impact when used as a face emotion detector that could give marketers unprecedented insight into the mindset of individual consumers.
By turning the iPhone X’s face recognition system into a face emotion monitor, the technology could detect minute changes in users’ expressions that convey their reactions to the content they are viewing. Such readings could be used to gauge how receptive consumers are to pitches, allowing marketers to adjust their messaging accordingly, according to an opinion article in Ad Age.
“(A) promising scenario for the world’s best marketers is that consumers will be willing to share their facial expressions and emotional analysis for the convenience, tailoring and rewards it offers them,” wrote Grant Owens, CSO at Critical Mass.
Owens sees major potential for face emotion technology across a wide variety of consumer-facing activities.
“Imagine if digital-led customer service functions were informed and prioritized according to your true emotional state, allowing a brand to respond to you more appropriately,” he noted. “Or imagine a financial service firm that knows how you’re feeling and helps you avoid a frivolous purchase or lets you know when your skepticism may be holding you back from something of true value. Imagine launching something in beta and not just hearing from a small, angry faction posting 1-star reviews, but surveying, in real-time, the aggregate joy that the vast majority of your customers are experiencing.”
Owens predicts that the rise of face emotion technology will cause marketers to shift away from click-path analysis techniques that they now use to determine digital users’ goals and interests and predict their future actions. With the arrival of the iPhone X, emotional-path analysis could usurp click-path analysis as the primary means of evaluating consumer sentiment.
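As a rough illustration of the shift Owens describes, the hypothetical sketch below contrasts a click-path log with an emotional-path log for the same session. All screen names, events, and valence scores are invented for illustration; the point is that the emotional path attaches a sentiment reading to each step rather than just counting taps:

```python
# Hypothetical sketch: click-path vs. emotional-path analysis.
# All screens, events, and emotion scores here are invented.

from collections import defaultdict

# Click-path log: which screens the user tapped through.
click_path = ["home", "product", "reviews", "product", "checkout"]

# Emotional-path log: the same session, but each step also carries
# an estimated valence in [-1.0, 1.0] from face emotion analysis.
emotional_path = [
    ("home", 0.1),
    ("product", 0.6),
    ("reviews", -0.4),   # e.g. frustration while reading reviews
    ("product", 0.5),
    ("checkout", 0.7),
]

def click_summary(path):
    """Classic click-path analysis: visit counts per screen."""
    counts = defaultdict(int)
    for screen in path:
        counts[screen] += 1
    return dict(counts)

def emotion_summary(path):
    """Emotional-path analysis: mean valence per screen."""
    totals, counts = defaultdict(float), defaultdict(int)
    for screen, valence in path:
        totals[screen] += valence
        counts[screen] += 1
    return {screen: totals[screen] / counts[screen] for screen in totals}

print(click_summary(click_path))
print(emotion_summary(emotional_path))
```

Where a click-path summary can only say the product page was visited twice, the emotional-path summary can also say how the user apparently felt on each screen.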
Currently, the iPhone X stores its face recognition data on the device and does not transmit it to external servers. However, app makers can use the iPhone X’s front-facing camera to observe users and perform face emotion analysis.
This may open the door to new experiences built on face emotion data.
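On iOS, face tracking exposes per-frame expression coefficients (ARKit’s blend shapes, each roughly 0.0–1.0, with keys such as `mouthSmileLeft` and `browDownLeft`). The Python sketch below is a simplified stand-in for what an app might do with such coefficients; the thresholds and the positive/negative/neutral labels are invented for illustration, not part of any Apple API:

```python
# Simplified stand-in for mapping ARKit-style blend-shape coefficients
# (each roughly 0.0-1.0) to a coarse emotion label.
# Thresholds and labels are invented for illustration.

def classify_expression(blend_shapes: dict) -> str:
    # Average the left/right smile coefficients into one smile score.
    smile = (blend_shapes.get("mouthSmileLeft", 0.0) +
             blend_shapes.get("mouthSmileRight", 0.0)) / 2
    # Average the left/right brow-lowering coefficients into a frown score.
    frown = (blend_shapes.get("browDownLeft", 0.0) +
             blend_shapes.get("browDownRight", 0.0)) / 2
    if smile > 0.5 and smile > frown:
        return "positive"
    if frown > 0.5 and frown > smile:
        return "negative"
    return "neutral"

# Example frame: user smiling at the content on screen.
frame = {"mouthSmileLeft": 0.7, "mouthSmileRight": 0.65, "browDownLeft": 0.1}
print(classify_expression(frame))  # -> positive
```

A real deployment would smooth these readings over many frames rather than labeling a single one, but the sketch shows how raw expression coefficients could become the kind of consumer-sentiment signal Owens describes.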
“For years, the advertising industry has been consumed with making digital experiences more human,” Owens wrote. “It started with mobile devices, touch screens, and intuitive taps, and it has evolved all the way to natural language-based AI. Now we’re faced with designing for faces—an incredibly complex and uniquely human feature, but also one that could tell us more about our customers than we’ve ever known.”
Nirel Marofsky is project analyst for the cognitive engine and application ecosystem at Veritone. She acts as a liaison to strategic partners, integrating developers and their capabilities into the Veritone Platform. Learn more about our platform and join the Veritone developer ecosystem today.