NETMEDIA International

Artificial intelligence at the service of sales: detecting emotions

Emotion accounts for 80% of what guides our purchase decisions, leaving only 20% to the rational part. Faced with this observation, what strategy is best suited?

For online shopping alone, according to an AOL study conducted with InsightsNow and Freed Vance Research Group, the emotional component amounts to 66%.

Artificial intelligence: the "eyes and ears" of our emotions

For consumers, an advertising message that engages them emotionally will have far more impact than a conventional one. Intermarché's advertising campaign "C'est magnifique !" illustrates this point well. Built around key values such as love, family and friendship, it brings consumers back to their most intimate emotions. Intermarché thus focuses not simply on selling its products (rational) but on the desire to please, through the products it makes available (emotional).

In 2016, Facebook, thanks to its reaction emojis (joy, sadness, astonishment, anger, laughter), was able to better know and understand its users based on the content they reacted to, and thus to offer each of them a personalized, adapted news feed.

In 2020, we can rely on AI to detect emotions (the data collected is known as "feel data") through four channels:

  • Facial recognition.

  • Behavioural recognition.

  • Speech recognition.

  • Text recognition.

Thanks to this data, AI solutions are able to create consistent and personalized paths, without human intervention.
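Purely as an illustration of the idea (not any vendor's actual method), combining per-channel emotion scores into a single "feel data" profile can be sketched like this; the channel names and weights are assumptions made up for the example:

```python
# Illustrative "feel data" fusion: weight emotion scores from the four
# channels (facial, behavioural, speech, text) into one profile.
# Weights are arbitrary assumptions, not from any real system.
CHANNEL_WEIGHTS = {"facial": 0.4, "behavioural": 0.2, "speech": 0.25, "text": 0.15}

def fuse_feel_data(channel_scores):
    """channel_scores: {channel: {emotion: score in [0, 1]}} -> weighted profile."""
    profile = {}
    for channel, emotions in channel_scores.items():
        weight = CHANNEL_WEIGHTS.get(channel, 0.0)
        for emotion, score in emotions.items():
            profile[emotion] = profile.get(emotion, 0.0) + weight * score
    return profile

observed = {
    "facial": {"joy": 0.8, "anger": 0.1},
    "speech": {"joy": 0.6},
    "text": {"joy": 0.4, "sadness": 0.2},
}
profile = fuse_feel_data(observed)
dominant = max(profile, key=profile.get)  # -> 'joy'
```

A real solution would of course learn such weightings from data rather than fix them by hand; the sketch only shows how several recognition channels can feed one decision.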

What solutions already exist to adapt to these emotionally driven behaviors?

Once again, start-ups are leading the way in designing technologically advanced and high-performance innovations in this area.

Facial and behavioral recognition

iMotions' solution collects and aggregates device data (eye tracking, facial expressions) used to measure user emotions and behavior. iMotions conducted a study for SPARK, which had partnered with Ford and Rovio, the developer of Angry Birds.

SPARK was looking for a new way to keep mobile advertisers invested in video games, as some of them wondered whether players had actually seen the ads displayed during the game and had clicked on them intentionally or by accident. SPARK, Ford and Rovio partnered to develop, test and scientifically validate new ad formats that would better capture customers' visual, cognitive and emotional attention and restore advertisers' trust.

iMotions also worked with HeyHuman, a behavioral communication agency. Their goal: to rethink brand communication and marketing. Because consumers are constantly bombarded with digital content, it is difficult to analyze how they actually interact with it. By measuring their behavior during ad testing, UX research, concept development and more, neuroscience consultant Aoife McGuinness was able to provide more nuanced insights and create more engaging content for her clients.

French start-up Datakalab, for example, uses facial coding technology to analyze consumers' emotions when faced with an advertisement, product or service. In particular, it worked with Disney to create an emotional timeline by measuring, through facial coding, the audience's emotional engagement while viewing the trailer of the movie The Call of the Antarctic. Datakalab was thus able to identify the passages that generated the most emotional engagement. The start-up is also working on emotional mapping, for the SNCF for example: using a connected wristband, it was able to measure the quality of the time spent in the station and its different spaces, and to identify the places that generate the strongest sensations for users.

Voice recognition

Thanks to its technology, Empath can now identify in real time the emotion expressed by a human voice, regardless of language. Trained on tens of thousands of voice samples, the algorithm detects four emotions (joy, calm, anger and sorrow) as well as an energy level, by analyzing the physical properties of the voice: tone, pitch, speaking rate... Empath has already proven itself in telemarketing, with a 20% increase in phone sales. This solution, which improves the accuracy of human behavior recognition systems, can be used in the automotive industry to detect a driver in difficulty, for example, as well as in the IoT sector (smart speakers, call centers...), robotics and psychiatric care.
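The kind of acoustic properties mentioned above (loudness, pitch, speaking rate) can be approximated with very simple signal processing. The sketch below is illustrative only and bears no relation to Empath's actual algorithm: it estimates the energy and, via the zero-crossing rate, the rough pitch of a waveform:

```python
import math

def acoustic_features(samples, sample_rate):
    """Extract simple prosodic features of the kind used in voice emotion analysis."""
    # Energy: mean squared amplitude, a proxy for loudness
    energy = sum(s * s for s in samples) / len(samples)
    # Zero-crossing rate: sign changes per second, a rough pitch proxy for voiced sound
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    zcr = crossings * sample_rate / len(samples)
    # A pure tone of frequency f crosses zero about 2*f times per second
    pitch_estimate = zcr / 2
    return {"energy": energy, "zcr": zcr, "pitch_hz": pitch_estimate}

# Synthetic 220 Hz tone, one second sampled at 8 kHz
rate = 8000
tone = [math.sin(2 * math.pi * 220 * t / rate) for t in range(rate)]
features = acoustic_features(tone, rate)  # pitch_hz comes out near 220
```

Real systems feed many such features (plus spectral ones) into a trained classifier; the point here is only that emotion detection from voice starts with measurable physical quantities, not with words.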

Text recognition

Since 2016, Q°emotion has been improving customer relations by analyzing texts from several channels: social networks, e-mails, surveys... It worked with the airline Corsair to improve each key step of the travel experience, enhance the passenger experience and build loyalty. Using Q°emotion, Corsair was able to run an emotional analysis of customers' post-trip reviews and thus identify the journey's pain points and its levers of delight. The study was a success: the Net Promoter Score (NPS) rose by 15 points and churn fell by 5%.
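At its simplest, text-based emotion detection can be sketched as lexicon matching: count emotion-bearing words in each review. The word lists below are made up for illustration and have nothing to do with Q°emotion's actual models, which are far richer:

```python
# Minimal lexicon-based emotion scoring sketch (illustrative only).
EMOTION_LEXICON = {
    "joy": {"great", "love", "enjoyed", "delighted", "smooth"},
    "anger": {"angry", "furious", "unacceptable", "worst"},
    "sadness": {"disappointed", "sad", "missed", "lost"},
}

def score_emotions(text):
    """Count how many words of each emotion category appear in the text."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    return {emo: sum(w in vocab for w in words)
            for emo, vocab in EMOTION_LEXICON.items()}

review = "Boarding was smooth and the crew was great, but my luggage was lost."
scores = score_emotions(review)
# scores -> {'joy': 2, 'anger': 0, 'sadness': 1}
```

Aggregated over thousands of verbatim reviews, even crude counts like these surface where in the journey negative emotions cluster, which is the principle behind pain-point detection.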

Comongo, for its part, lets you run a focus group digitally. The user decides which population to survey (customers, employees, prescribers...), then what to evaluate (product, service, campaign...). Comongo then generates a form and sends it to the group. Finally, the solution analyzes the verbatim responses to bring out the respondents' feelings. My Pro, a communication agency targeting SMEs and large companies across the Alpine arc, was curious about its brand image ahead of a merger with another company. By carrying out the study with Comongo, it was able to sweep away the preconceived ideas held by the company's management, without any point of disagreement, because the data spoke for itself.

According to Crone Consulting, the emotion analysis market is expected to reach $10 billion worldwide in 2020, up from $20 million in 2015. The market is booming and many solutions already exist. But promising as it looks, we should not overlook some of the difficulties it could encounter. Facial recognition, for example, could struggle with the diversity of emotional expression across cultures. In Japan, being lucky or calculating is treated as an emotion, whereas in Europe it is considered a quality or a flaw. In Tahiti, sadness does not exist as such; Tahitians express it through a state of weariness. Could a machine one day interpret and understand this type of specificity?
