Remember the memory-erasing Neuralyzer in Men in Black? Or, more recently, Ex Machina, the Oscar-winning story of a humanoid robot that uses emotional persuasion to outsmart humans and escape from the secluded home of its creator?

While movies have been envisioning crazy new technology for decades, some of these inventions are starting to become reality. To continue the movie theme for a moment, consider all the social media activity last year on “Back to the Future Day,” when people around the world compared the technological advances predicted in 1989’s Back to the Future Part II with the devices at our fingertips today. From virtual reality and wearable devices to facial and emotional recognition technologies, these products and systems are changing the way we communicate, interact and, of course, conduct marketing research (MR).

One of the hottest areas of technology development in MR for 2016 is facial and emotion recognition. Understanding emotions is hugely powerful in MR but notoriously difficult to achieve. Facial expressions are strongly linked to emotions, and for many years research organizations have relied on human observation of recorded video to assess emotional response. Human assessment has many limitations, however, and facial expression recognition technology offers an opportunity to overcome some of them, delivering a far greater level of insight into personal sentiment and reactions.

According to research by Dr. Paul Ekman, a pioneer in the study of emotions and facial expressions and Professor Emeritus of Psychology at the University of California, San Francisco, Medical School, brief flashes of emotion displayed on a respondent’s face – “micro-expressions” – reveal a person’s beliefs and their propensity to act or buy.

The scope for this technology goes beyond pure research. Customer experience leaders have declared 2016 “The Year of Emotion,” continuing the trend for MR and Voice of the Customer (VoC) to become increasingly complementary disciplines. This trend is also likely to fuel demand from enterprises that expect their MR providers to offer the most cutting-edge research technologies.

Emotions drive spending and loyalty. Organizations managing research programs and customer experience activities can use emotion detection technology to analyze people’s emotional reactions at the point of experience. This knowledge not only gives researchers a greater understanding of behavior patterns but also helps predict likely future actions of those consumers.

The result? An unprecedented level of insight into what affects customer emotions. Such valuable information can drive better business decisions, resulting in improved product and service offerings and experiences.

Do we need this? How will we use it?

Marketing researchers are under increasing pressure to deliver real business value to their customers. Adding to that pressure are the ongoing decline in survey response rates and the challenge of collecting data from specific demographic groups. The race to find ways to complement panels, focus groups and surveys is on, and emotion detection provides some real opportunities.

As with many leading-edge technologies, the range of potential applications is vast, but adoption will start from relatively niche and specific beginnings. The primary use case for researchers implementing emotion detection is advertisement testing. Within a survey, an advertisement can be shown while the respondent’s webcam records their reaction. Traditionally, respondents would answer questions about the advertisement, rating it on various scales. While broadly effective, this approach depends on the respondent’s ability to recall what they’ve just been shown, their interpretation of their own emotions, and their ability to put those emotions into words. Researchers can also observe and record emotions while the video content is being shown, but great skill is required and consistency is difficult to achieve.

Technology that monitors facial expressions bypasses these issues by capturing data as the respondent views the video. With a traditional view-then-report approach, fleeting emotions may go unrecognized: respondents are more likely to remember how they felt at the most memorable points in the advertisement, especially the end. And if a human observer records the respondent’s emotions instead, the danger is that different observers will interpret the same expressions differently. Using technology throughout the viewing stage removes both problems, enabling advertisers to understand how the tiniest elements of their video affect audience response.

How does it work?

There are many technologies available now, with many more joining the market. The specific approaches vary but, broadly, they all capture video and then analyze it for facial movements that correspond to emotions, typically based on the Facial Action Coding System (FACS), originally developed for human observers. Facial imaging uses machine learning algorithms to build a huge reference database of expressions against which to judge the face being viewed. It’s not dissimilar to the way in which text analytics systems use a large corpus of relevant text to “learn” how to categorize particular words, phrases and verbal expressions.
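
To make the idea concrete, here is a minimal Python sketch of FACS-style scoring. The action-unit-to-emotion rules are simplified textbook combinations and the inputs are invented; no vendor’s actual model works this crudely:

```python
# Illustrative sketch of FACS-style scoring. The AU-to-emotion rules below
# are simplified textbook combinations, not any vendor's actual model.
EMOTION_RULES = {
    "joy":      {6, 12},        # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "sadness":  {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "anger":    {4, 5, 7, 23},  # brow lowerer, lid raiser/tightener, lip tightener
}

def score_emotions(detected_aus):
    """Fraction of each emotion's defining action units present in one frame."""
    return {
        emotion: len(aus & detected_aus) / len(aus)
        for emotion, aus in EMOTION_RULES.items()
    }

# A frame showing AU6 + AU12 (plus AU25, lips part) scores joy at 1.0.
print(score_emotions({6, 12, 25}))
```

Real systems replace hand-written rules like these with classifiers trained on that large reference database, but the underlying idea – matching observed facial movements against known emotional signatures – is the same.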

With that database in place – a database that grows with every use – the solution offers a scalable, repeatable and consistent way to identify emotions. It accesses the respondent’s camera to capture their changing expressions throughout the video, then uses algorithms to analyze those expressions and assess them against a standardized set of emotional responses (anger, joy, surprise, etc.). An aggregated result is then created for the full respondent set, representing the split of those emotions at key points in the video. Researchers can then compare the aggregate emotional performance of their clip against a benchmark.
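
As a rough sketch of that aggregation step, consider the following; the data structures (per-second emotion labels keyed by respondent) and the benchmark figures are assumptions for illustration:

```python
from collections import Counter

# Illustrative sketch of the aggregation step. The data structures
# (per-second emotion labels keyed by respondent) are assumptions.

def aggregate_split(labels_by_respondent, second):
    """Share of respondents showing each emotion at a given second of the ad."""
    counts = Counter(labels[second] for labels in labels_by_respondent.values())
    total = sum(counts.values())
    return {emotion: n / total for emotion, n in counts.items()}

def vs_benchmark(split, benchmark):
    """Difference from benchmark norms at the same point (positive = above norm)."""
    return {e: split.get(e, 0.0) - benchmark.get(e, 0.0) for e in benchmark}

labels = {"r1": ["joy", "joy"], "r2": ["surprise", "joy"], "r3": ["joy", "anger"]}
split = aggregate_split(labels, second=1)   # ~{'joy': 0.67, 'anger': 0.33}
print(vs_benchmark(split, {"joy": 0.40, "anger": 0.10, "surprise": 0.15}))
```

The same structure extends naturally to second-by-second emotion curves and to benchmark comparisons across a whole library of tested ads.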

Does it actually work?

It’s a fair question. The ability to use video to recognize, understand and report back on the tiniest facial movements doesn’t sound far away from the Ex Machina humanoid. The short answer, however, is yes, it does work. Dr. Ekman’s research showed that certain expressions and micro-expressions are universal, corresponding to specific emotional responses across cultures. So using technology to capture those facial movements and analyze them against benchmark data is hugely powerful. Some tests report an accuracy rate of around 95 percent, which is, by any measure, impressive.

The technology is already in use at a number of leading firms, which have been able to refine their advertising campaigns according to respondents’ reactions to test advertisements. Not only does this technology enable marketing teams to create the most effective advertisements possible, it’s also an ideal tool for localizing content in global campaigns, identifying the precise elements that resonate with – or alienate – a particular market. Launching a TV or video ad can be very expensive, so there is great value in ensuring that the ad performs as desired before launch.

Sounds too good to be true . . . what’s the downside?

There are some restrictions, of course. First, if you’re looking to bring emotion detection into your MR arsenal, consider the global nature of your programs. People of different nationalities and cultures display different levels of emotional expressiveness, not to mention different facial structures, so your benchmark data needs to take this into account. You’ll need to make sure that your provider has worked with respondents in your key regions.
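
As a hedged illustration of why this matters, a scoring pipeline might express raw emotion intensities relative to region-specific baselines before any comparison is made. Every figure below is invented:

```python
# Illustrative sketch only: all baseline figures are invented.
REGIONAL_BASELINES = {
    "US": {"joy": 0.30, "surprise": 0.12},
    "JP": {"joy": 0.18, "surprise": 0.08},  # hypothetical lower-expressivity norm
}

def normalize(raw, region):
    """Express raw intensities relative to the regional norm (1.0 = typical)."""
    base = REGIONAL_BASELINES[region]
    return {e: raw[e] / base[e] for e in raw if e in base}

# The same raw joy reading sits below one regional norm but above another.
print(normalize({"joy": 0.24}, "US"))  # {'joy': 0.8}
print(normalize({"joy": 0.24}, "JP"))  # {'joy': 1.33...}
```

The point is not the arithmetic but the dependency: without region-appropriate baselines, the same raw reading can look like enthusiasm in one market and indifference in another.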

A second issue to consider is content delivery. While many people are now used to engaging with video content on a variety of devices, including mobile phones and tablets, facial expression recognition requires a two-way view: respondents must not only be able to see your content clearly but must also be in a position and environment where their camera can capture their expressions clearly. Lighting levels, viewing angles and any changes to either during capture all need to be taken into account. On a laptop, which almost always has a built-in webcam, this is fairly simple; people tend to be sitting and giving the large screen their full (or almost full) attention. On a mobile device, however, chances are the respondent is moving around or shifting the angle at which they hold the device. This reduces the clarity with which their face can be viewed, hindering the technology’s ability to read those critical expressions.
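
A simple pre-filter can screen out unusable frames before any analysis runs. Here is an illustrative sketch using OpenCV’s stock face detector; the brightness and face-size thresholds are assumptions, not industry standards:

```python
import cv2

# Illustrative pre-filter: discard frames where the face cannot be read
# clearly (too dark, face missing, or face too small in the frame).
# Thresholds are assumptions for illustration, not industry standards.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def frame_is_usable(frame, min_brightness=60, min_face_frac=0.05):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if gray.mean() < min_brightness:   # too dark to read expressions
        return False
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) != 1:                # no face found, or more than one viewer
        return False
    x, y, w, h = faces[0]
    face_frac = (w * h) / (gray.shape[0] * gray.shape[1])
    return face_frac >= min_face_frac  # face large enough to analyze reliably
```

A vendor’s pipeline will be far more sophisticated, but screening of this kind is what separates usable mobile captures from noise.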

Finally, there’s the privacy issue to contend with.1 It is straightforward to address, but you will need to explicitly ask respondents for permission to access their webcam and record their faces while they watch your content. For many, this won’t be an issue, but if you’re targeting an older demographic, for example, they may think you’ve shifted from curious to creepy. In such cases, you may be on safer ground showing video content and asking questions rather than observing expressions.

So, where does this take marketing research?

Like every next “big thing,” emotion detection software simply adds to the toolkit available to the experienced marketing researcher. It may further reduce the need for focus groups, but beyond that, it’s an addition, not a replacement. Such videos will, in most cases, be embedded in our old friend, the survey, and additional information will be required to understand more about the respondents themselves.

No doubt new applications of the technology will emerge in both MR and customer experience disciplines, some of which will fly and some of which won’t. As with most advances of the last decade (mobile, social analytics, text analytics, beacon technologies and others), emotion detection will find its place and help forward-thinking researchers to continue to add value to the services they provide to their customers.


1 The Marketing Research Association represents the research profession’s interests in a multistakeholder process on facial recognition privacy, and presented a white paper on MR applications of the technology on February 6, 2014: http://www.insightsassocation.org/article/facial-recognition-privacy-kic...