What Role Will Artificial Intelligence and Emotion Play in the Future of Digital Advertising?


The idea of computers sensing and responding to human emotions and attitudes might sound like science-fiction, but the empathetic technology enabling this is coming along faster than you might think.

Companies such as Amazon, IBM, and Walmart are researching ways to combine biosensors with artificial emotional intelligence (AEI) to detect sentiments that might influence consumer purchase decisions. Meanwhile, publishers such as ESPN, The New York Times, and USA Today are developing and distributing advertising tools for matching ads to consumers’ moods.

This is just the beginning, according to analysts. Indeed, Gartner, in its annual strategic predictions, posits that by 2024, AI identification of emotions will influence more than half of all online ads people view.

“We’re certainly seeing the groundwork being laid for more AEI-informed campaigns,” said Kristina LaRocca-Cerrone, a director with Gartner for Marketers. “In fact, we view AEI-based advertising as the next frontier for better consumer engagement and deeper customer understanding.”

Dipanjan Chatterjee, vice president and principal analyst at Forrester, agreed.

“Over the last decade, the discipline of marketing has been waking up to a far more complex and sophisticated understanding of the role of emotions than the simplistic, tug-at-your-heartstrings, garden variety of advertising routinely employed by the ‘Mad Men’ of the industry,” he told CMO by Adobe.



“We now have the substantiation to conclude that emotions comprise a majority of the glue that binds consumers to brands.”
DIPANJAN CHATTERJEE, VP AND PRINCIPAL ANALYST, FORRESTER

How Emotion Detection Works

AEI is essentially an offshoot of AI, where computer algorithms enable machines to closely observe, learn from, and mimic human behavior. While the technology behind AEI has grown rapidly in recent years, the concept itself–also known as “affective computing”–dates back to 1995 when MIT Media Lab professor Rosalind Picard published a widely read paper on the subject.

Today, emotion detection and recognition work through a confluence of facial-recognition, voice-pattern, and deep-learning analysis technologies. Arguably, these are still in the early stages of development, even though the global market is already expected to be worth nearly $25 billion by next year.
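
As a rough illustration of how those signals might be combined, here is a minimal late-fusion sketch in Python. It is a toy under stated assumptions: the per-modality probabilities would come from upstream face and voice models (not shown), and the emotion labels and weights are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical per-modality probabilities, e.g. produced upstream by
# separate face and voice models (not shown; labels are invented).
@dataclass
class ModalityScores:
    face: dict   # emotion label -> probability from facial analysis
    voice: dict  # emotion label -> probability from voice analysis

def fuse_emotions(scores: ModalityScores, face_weight: float = 0.6) -> dict:
    """Late fusion: a weighted average of per-modality probabilities."""
    voice_weight = 1.0 - face_weight
    labels = set(scores.face) | set(scores.voice)
    return {
        label: face_weight * scores.face.get(label, 0.0)
               + voice_weight * scores.voice.get(label, 0.0)
        for label in labels
    }

sample = ModalityScores(
    face={"joy": 0.70, "neutral": 0.25, "anger": 0.05},
    voice={"joy": 0.40, "neutral": 0.50, "anger": 0.10},
)
fused = fuse_emotions(sample)
print(max(fused, key=fused.get), fused)  # "joy" wins under these weights
```

The design choice shown is so-called late fusion, which combines each modality’s output rather than its raw data; real systems may fuse earlier in the pipeline or learn the weights instead of fixing them.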

For marketers and advertisers, it’s thought these technologies will eventually help brands predict how groups or individuals might react–positively or negatively–to messages delivered through ads at different stages of the customer journey.

For instance, if a consumer just booked a European river cruise and is looking at the online itinerary the next day, AEI could, theoretically, take note of related online comments, facial expressions, or body movements, conclude the person is excited, and trigger a text message promoting fun excursions. Similarly, AEI tools could help marketers avoid insensitive outreach, such as sending consumers upbeat personalized ads when they might be crying.
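
To make that trigger logic concrete, here is a tiny rule-of-thumb sketch of the kind of guardrail described above. The labels, threshold, and function name are assumptions for illustration, not any real campaign system’s API.

```python
# Hypothetical guardrail: only send a promotional message when the
# inferred emotion is positive and confidence is high; never market
# into a negative state. Labels and threshold are assumptions.
POSITIVE = {"joy", "excitement"}
NEGATIVE = {"sadness", "anger", "fear"}

def should_send_promo(emotion: str, confidence: float,
                      threshold: float = 0.8) -> bool:
    if emotion in NEGATIVE:
        return False  # suppress outreach entirely
    return emotion in POSITIVE and confidence >= threshold

print(should_send_promo("excitement", 0.91))  # True: send the excursion offer
print(should_send_promo("sadness", 0.95))     # False: hold all messaging
```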

AEI could also be used to help brands detect and respond when customers seem irritated with any aspect of their purchase experience. Amazon’s Alexa, for example, will add frustration-detection capabilities next year that use tone of voice to determine when people become annoyed with the assistant.

The Complexity Of Reading Emotions

Presumably, if you’re able to collect enough empirical data about how people might react to various emotional stimuli at any given time, you could automatically customize ads and send them out to the right people at the right times–at scale. But achieving that is far more difficult than it sounds because human emotions are incredibly complex and vary widely by situation and the individuals involved.

For example, a blank stare by one person might suggest boredom or apathy. For another, it might be an expression of anger. For someone else, it might mean they weren’t listening, didn’t hear a statement, or had no reaction to it. What’s more, those expressions could be influenced by age, gender, culture, sexual orientation, and a host of other factors.

The human brain does a fair job of handling such nuances. Algorithms do not, at least not yet. They need huge stores of images to compare against one another and, even then, often have to be trained to recognize subtleties, leading some researchers to question whether machine-read emotion is plausible at all.

Recently, a group of senior scientists organized by the Association for Psychological Science (APS) completed a two-year review of more than 1,000 studies examining whether it’s possible to determine emotion from facial expressions. Their conclusion? It won’t happen unless the scientific community stops ignoring “substantial, meaningful variability attributable to context.”

“Technology will never be able to read consumers like a book,” said Northeastern University Psychology Professor Lisa Feldman Barrett, who co-authored the report and is also the author of How Emotions Are Made: The Secret Life of the Brain. “Technology at its best will do what people do with each other, and that is to infer the meaning of signals with some degree of confidence. But even that will require a major change in how emotion research is done. Scientists have to stop looking for simple read-outs of emotion and start measuring and modeling variation in how people express emotion.”

Mihkel Jäätma, CEO of Realeyes, a London-based firm that partners with brands including AT&T, Mars, Hershey’s, and Coca-Cola to measure human attention and emotion in order to improve consumer experiences, is more optimistic.

“Science has proven you can accurately measure emotions from facial expressions, even though all emotions may not surface on the face,” Jäätma told CMO by Adobe.

Jäätma also noted that inferences about emotion are context-dependent, benefit from multiple sources of information, and vary with individual differences and culture. Fortifying facial measurement with these contextual attributes, along with additional signals such as voice, can lead to greater predictive insight for advertising purposes, he said.
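
One simple way to picture that “fortifying” is to treat context as a prior that reweights a model’s raw scores. The sketch below is a hypothetical illustration of the idea, not Realeyes’ actual method, and every number in it is made up.

```python
# Hypothetical context adjustment: treat context as a prior that
# reweights raw per-emotion scores, then renormalize. All numbers
# here are invented for illustration.
def apply_context_prior(scores: dict, prior: dict) -> dict:
    weighted = {label: p * prior.get(label, 1.0) for label, p in scores.items()}
    total = sum(weighted.values()) or 1.0
    return {label: w / total for label, w in weighted.items()}

raw = {"joy": 0.45, "neutral": 0.45, "anger": 0.10}
comedy_clip_prior = {"joy": 1.5, "neutral": 1.0, "anger": 0.5}
print(apply_context_prior(raw, comedy_clip_prior))
# Same face, different read: the comedy context tilts the estimate toward joy.
```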

Such data is already helping advertisers understand how different creative executions drive emotions across customer groups and how those emotions translate into better experiences and loyalty. It is also proving useful for understanding how user experience within mobile, gaming, or streaming apps can influence emotion, enjoyment, and even well-being.

“Our emotion classifiers, which use billions and billions of emotional data points collected from opt-in participants’ laptops, smartphones, and tablets from all over the world, are each checked by human annotators for accuracy and interpretability,” Jäätma said. “We are approaching human-level precision and judgment at scale.”

Accounting For Privacy Implications

Of course, getting individuals to allow brands to track their emotions could be a challenge. Privacy concerns are mounting, and regulation is expanding, especially in Europe under the General Data Protection Regulation (GDPR).

“As with any personalization or tailoring effort, advertisers and marketers will need to consider what level of opt-in or opt-out they need to provide and how customer data will be protected,” Gartner’s LaRocca-Cerrone told CMO by Adobe. “To do that, it will be important to clearly understand your goals for the use of AEI. Those goals will shape which AEI methodology you pursue, how you will apply it, how you will measure it, and what kind of customer data you will need to capture.”
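
As a minimal sketch of what such an opt-in gate might look like in code, assuming an explicit, revocable consent flag, the snippet below shows one way to ensure emotion signals are never processed without permission. All names and fields are illustrative, not any vendor’s API or a legal standard.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative consent gate; field and function names are assumptions,
# not any specific platform's API or a legal standard.
@dataclass
class ConsentRecord:
    user_id: str
    emotion_tracking_opt_in: bool  # explicit, revocable opt-in flag

def infer_emotion(signal: bytes) -> str:
    """Stand-in for a real model; returns a fixed placeholder label."""
    return "neutral"

def emotion_for_user(consent: ConsentRecord, signal: bytes) -> Optional[str]:
    if not consent.emotion_tracking_opt_in:
        return None  # no opt-in, no inference: the signal is never processed
    return infer_emotion(signal)

print(emotion_for_user(ConsentRecord("u123", False), b"raw-signal"))  # None
print(emotion_for_user(ConsentRecord("u123", True), b"raw-signal"))   # neutral
```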

Ultimately, LaRocca-Cerrone said, AEI will help marketers achieve greater emotional resonance in their work and deliver more tailored experiences for customers.

“But AEI-informed advertising will fail if marketers don’t keep the customer at the forefront of everything they do,” she said. “AEI should be a supplement–not a substitute–for customer understanding. Marketers must stay focused on how AEI helps customers. It can’t become just another tool to induce them to make purchases. Rather, it should improve brand service offerings and experiences.”
