Is Facial Recognition in Retail Market Research the Next Big Thing?

Remember the memory-erasing Neuralyzer in "Men in Black"? Or more recently, "Ex Machina," the Oscar-winning story of a humanoid robot that uses emotional persuasion to outsmart humans and escape from the secluded home of its creator?

While movies have been envisioning crazy new technology for decades, some of those inventions are starting to become reality. From virtual reality and wearable devices to facial and emotion recognition technologies, these products and systems are changing the way we communicate, interact and conduct market research (MR) in several industries, most notably retail.

One of the hottest areas of technology development in retail research is facial and emotion recognition. Understanding emotions is powerful in areas of research such as ad testing, but difficult to achieve. Facial expressions are linked to emotions, and for years research organizations have used human observation of recorded video in retail settings to assess emotional response. Human assessment has many limitations, and facial expression recognition technology offers a way to overcome some of them, delivering far greater insight into personal sentiment and reactions.

Organizations managing research programs and retail customer experience activities can use emotion detection technology to analyze people’s emotional reactions at the point of experience. This knowledge not only gives researchers a greater understanding of behavior patterns, but also helps predict likely future purchasing actions of that consumer.

The result? Remarkable insight into what impacts customer emotions, as well as valuable information that can drive better business decisions, resulting in improved product and service offerings and experiences.

Do we need this? How will we use it?

Competition in retail only continues to grow, making experiences more important than ever. Market researchers are thus under increasing pressure to deliver business value to their customers. Adding to that pressure are steadily declining survey response rates and the difficulty of collecting data from specific demographic groups. Emotion detection provides real opportunities to drive customer spending and enhance loyalty.

The primary use case for researchers implementing emotion detection is ad testing: an advertisement is shown within a survey while the respondent's webcam records their reaction.

Traditionally, respondents answer questions about the advertisement they've been shown, rating it on various scales. While broadly effective, this approach depends on respondents' ability to recall what they've just seen, to interpret their own emotions, and to put those emotions into words. Researchers can also observe and record emotions while the video content is being shown, but this requires specific skills and is difficult to perform consistently.
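To make the webcam approach concrete, here is a minimal sketch of how frame-by-frame output might be turned into an ad-level emotional "trace." It assumes a hypothetical facial-expression classifier has already scored each video frame; the frame data, emotion labels and function names are illustrative, not from any specific vendor's product:

```python
from collections import defaultdict

# Hypothetical per-frame output from a facial-expression classifier:
# (timestamp in seconds, {emotion: probability}) while a respondent watches an ad.
frames = [
    (0.0, {"joy": 0.10, "surprise": 0.05, "neutral": 0.85}),
    (0.5, {"joy": 0.20, "surprise": 0.60, "neutral": 0.20}),
    (1.0, {"joy": 0.70, "surprise": 0.20, "neutral": 0.10}),
    (1.5, {"joy": 0.65, "surprise": 0.10, "neutral": 0.25}),
]

def emotion_timeline(frames, bucket_sec=1.0):
    """Average frame-level emotion scores into fixed-width time buckets,
    producing a per-second emotional trace of the ad."""
    buckets = defaultdict(list)
    for t, scores in frames:
        buckets[int(t // bucket_sec)].append(scores)
    timeline = {}
    for bucket, score_list in sorted(buckets.items()):
        emotions = score_list[0].keys()
        timeline[bucket] = {
            e: sum(s[e] for s in score_list) / len(score_list) for e in emotions
        }
    return timeline

def dominant_emotions(timeline):
    """Report the strongest emotion in each time bucket."""
    return {b: max(scores, key=scores.get) for b, scores in timeline.items()}

trace = emotion_timeline(frames)
print(dominant_emotions(trace))  # {0: 'neutral', 1: 'joy'}
```

Unlike a post-hoc survey question, a trace like this ties the reaction to a specific moment in the ad, which is exactly the recall problem the paragraph above describes.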

Does it work?

The ability to use video to recognize, understand and report on the tiniest facial movements doesn't sound far from the "Ex Machina" humanoid. Research shows there is a broad array of expressions and micro-expressions that map to specific emotional responses, so using technology to capture those facial movements and analyze them against benchmark data is hugely powerful. Some tests report accuracy rates of around 95%, which by any measure is impressive.

The technology is already in use by a number of retailers, who have been able to refine their advertising campaigns according to respondents’ reactions to test adverts.

What’s the downside?

If you’re looking at bringing emotion detection into your arsenal, consider the global nature of your programs. People of different nationalities and cultures display different levels of emotional response, and have different facial structures, so your benchmark data needs to account for both.
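One common way to handle that benchmarking concern is to score each respondent relative to their own panel's baseline rather than against a single global norm. The sketch below illustrates the idea with z-scores; the panel names and baseline figures are invented for illustration:

```python
# Illustrative only: baseline expressiveness differs across respondent pools,
# so a raw emotion score is compared against that pool's own norm.
BASELINES = {  # hypothetical (mean, stdev) of raw "joy" scores per panel
    "panel_a": (0.30, 0.10),
    "panel_b": (0.55, 0.15),
}

def normalized_response(raw_score, panel):
    """Z-score a raw emotion reading against its panel's baseline, so the
    same 0.60 'joy' reading means different things in different panels."""
    mean, stdev = BASELINES[panel]
    return (raw_score - mean) / stdev

print(normalized_response(0.60, "panel_a"))  # ~3: well above that panel's norm
print(normalized_response(0.60, "panel_b"))  # ~0.33: barely above the norm
```

The design point is that the interesting quantity is the deviation from a culturally appropriate baseline, not the raw score itself.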

A second issue to consider is how to deliver your content. While many people are now used to engaging with video content through a variety of media, facial recognition technology requires a two-way view. This means that not only must your respondent be able to clearly and effectively view your video content, they must also be positioned so their camera captures their expressions clearly. Lighting levels and viewing angles both need to be taken into account.

Finally, you will need to specifically ask respondents for permission to access their webcam and record their faces while they’re watching your content. For many, this won’t be an issue, but if you’re specifically targeting, for example, an older demographic who may think you’ve shifted from curious to creepy, then you may be on safer ground showing video content and asking questions instead of observing their expressions.


Emotion detection software simply adds to the toolkit available to retailers who are looking to improve their customer experiences and create more effective advertising campaigns. It may further reduce the need for focus groups, but beyond that, it’s an addition, not a replacement. Such videos will, in most cases, be embedded into a survey, and additional information will be required to understand more about the shoppers themselves.

No doubt new applications of the software will emerge in both MR and customer experience disciplines – some will fly and some won’t. As with most advances of the last decade, emotion detection will find its place and help forward-thinking retailers add value to the services they provide to their customers. In turn, this should help move the retail space forward, taking retailers’ ability to deliver customer service and act on customer feedback to the next level.

Terry Lawlor is executive VP of product management at Confirmit, where he is responsible for all aspects of product management, including strategy development, product definition, and product representation in client and marketing activities.