How brands are using emotion-detection technology


Are you in touch with your emotions?

Well, your favourite brand might soon be, all thanks to emotion detection and recognition technology – a market that’s predicted to grow to $65 billion by 2023.

This booming market is being fuelled by advances in artificial intelligence, and by big investment from organisations (across a range of industries) keen to study consumer behaviour. The technology works by analysing facial signals and body language – including nuanced expressions – to determine internal emotions.

In theory, this information can be used to then trigger or change decision-making – or simply create a more emotive or personalised experience.
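At its simplest, the signal-to-emotion step described above can be sketched as a mapping from facial "action units" (muscle movements, loosely following the Facial Action Coding System) to coarse emotion labels. Real emotion-AI products use trained models rather than hand-written rules; the rule table, unit names, and labels below are illustrative assumptions only.

```python
# Hypothetical sketch: rule-based mapping from facial action units (AUs)
# to coarse emotion labels. Commercial emotion AI uses trained models;
# this only illustrates the facial-signal -> emotion step.

# Each emotion is keyed by the set of action units that must be active.
EMOTION_RULES = {
    frozenset({"cheek_raiser", "lip_corner_puller"}): "happiness",
    frozenset({"inner_brow_raiser", "brow_lowerer", "lip_corner_depressor"}): "sadness",
    frozenset({"inner_brow_raiser", "outer_brow_raiser", "jaw_drop"}): "surprise",
    frozenset({"brow_lowerer", "lid_tightener", "lip_tightener"}): "anger",
}

def classify_emotion(active_units):
    """Return the first emotion whose required action units are all active."""
    active = set(active_units)
    for required, emotion in EMOTION_RULES.items():
        if required <= active:
            return emotion
    return "neutral"
```

In a real pipeline, the `active_units` input would come from a computer-vision model run on each video frame; everything downstream (ad testing, gameplay adjustment) then consumes the emotion label or a continuous intensity score.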

So, what exactly are brands doing, and why? And what about the matter of privacy concerns? Let’s discuss.

Disney & market research

Market research within the film industry is usually qualitative, with data being manually collated from surveys, reviews, and post-screening responses.

Disney, however, has been using technology to determine how audiences enjoy its movies, specifically creating an AI-powered algorithm that can recognise complex facial expressions and even predict upcoming emotions.

The software – which involves a method called ‘factorised variational autoencoders’ (FVAE) – captured people’s faces using infrared cameras during movie screenings including ‘The Jungle Book’ and ‘Star Wars: The Force Awakens’.

After just a few minutes of tracking facial behaviour, the algorithm could predict when individual viewers would smile or laugh (in relation to specific moments in the movies).

It’s not just what the technology does that makes it impressive, but at what scale. According to reports, the tests generated 16 million data points derived from 3,179 viewers – far more than human intelligence could process.
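The "factorised" part of FVAE refers to decomposing the large viewers-by-moments matrix of reactions into a small number of shared patterns. The following is not Disney's method – FVAE is variational and far more sophisticated – but a plain low-rank factorisation on synthetic data illustrates the core idea: a few minutes of one viewer, combined with full recordings of other viewers, is enough to predict that viewer's later reactions.

```python
import numpy as np

# Illustrative sketch only (not Disney's FVAE): synthetic smile-intensity
# data where rows are viewers and columns are moments in the film, with
# an exact low-rank structure.
rng = np.random.default_rng(0)
n_viewers, n_moments, rank = 20, 40, 3
U_true = rng.random((n_viewers, rank))
V_true = rng.random((rank, n_moments))
reactions = U_true @ V_true

# Viewer 0's reactions after moment 15 are the "future" we want to predict.
known = reactions[0, :15]

# Step 1: learn a rank-3 basis of per-moment reaction patterns from the
# other viewers, who have been observed for the whole film.
_, _, Vt = np.linalg.svd(reactions[1:], full_matrices=False)
basis = Vt[:rank]                                  # rank x n_moments

# Step 2: fit viewer 0's loadings on that basis using only their first
# 15 moments, then extrapolate to the rest of the film.
coeffs, *_ = np.linalg.lstsq(basis[:, :15].T, known, rcond=None)
predicted_future = coeffs @ basis[:, 15:]

error = np.abs(predicted_future - reactions[0, 15:]).max()
```

Because the synthetic data is exactly low-rank and noise-free, the prediction here is essentially perfect; real audience data is noisy, which is what the variational machinery in FVAE is there to handle.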

In theory, this also creates a more accurate picture of audience response. Standard surveys and reviews could be coloured by context, such as people not wanting to be overly negative, or forgetting how they really felt in specific moments.

In contrast, AI-powered tech creates real-time and reliable data, allowing Disney greater insight into what provokes (the desired) emotion.

Kellogg’s & digital ads

While Disney uses emotion-detection tech to find out opinion on a completed project, other brands have used it to directly inform advertising and digital marketing.


Affectiva is one of the most prominent technology developers in this area, with its emotion AI being used by a number of Fortune 500 brands.

Kellogg’s is just one high-profile example, having used Affectiva’s software to test audience reaction to ads for its cereal.

The brand showed consumers multiple versions of an ad, eventually concluding that while one version produced the most laughs on an initial viewing, it generated little engagement on a second.

As a result, Kellogg’s chose the version that generated steadier levels of engagement over the course of multiple views.

This type of data-driven marketing can be invaluable for brands like Kellogg’s, but it naturally raises privacy issues. The concern is not just the invasion of privacy that comes with brands holding personal information; because the data is emotional in nature, brands could potentially use it to manipulate or exploit consumers.

Unsurprisingly, organisations like Affectiva are wary of this, highlighting that they only use personal data with consent, and that it can be deleted on request at any time.

Unilever & candidate interviews

Brands aren’t just using emotion-detection technology for consumer-facing purposes. Some are also utilising it internally, specifically when it comes to hiring new employees.

Unilever does this, using HireVue’s AI-powered technology to screen prospective candidates based on factors like body language and mood. In doing so, the company is able to find the person whose personality and characteristics are best suited to the job.

By analysing body language and facial expressions, the algorithm can predict how the candidate might react or behave in certain situations. It can also detect whether a candidate might be lying, as well as their general confidence levels based on how emotions change during responses.

Unilever is then able to easily determine who has the traits required for the job, ultimately shortening what would otherwise be a lengthy hiring process.

Nevermind & altered video game play

Video games are designed to evoke a series of emotions in players, with these reactions often being intrinsic to overall enjoyment of the game. Previously, however, developers had to build a single blanket experience for all players, regardless of individual tastes, preferences, and unique responses.


Now, thanks to recognition technology, developers are integrating emotion-detection into the user experience in order to adjust or tweak sequences in real-time.

The thriller video game, Nevermind, does this, using a webcam to monitor the player’s facial expressions. From this, it then alters gameplay, perhaps increasing difficulty or lowering it based on the level of fear or anxiety.

As well as creating a more personalised experience, there’s an additional benefit to this kind of technology: teaching players how to control their own emotions, both in the game and in a real-life context.

If the game becomes easier when they are calm, for instance, they are likely to learn this over time and begin to control their own stress levels (and in turn, the game). This ability also removes points of frustration, helping to increase session time and create a more immersive experience.
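The feedback loop described above can be sketched as a simple controller: a fear score feeds in, difficulty nudges up or down. The thresholds, the 1–10 scale, and the fear-score input are illustrative assumptions, not Nevermind's actual logic.

```python
# Hypothetical sketch of biofeedback difficulty adjustment, in the spirit
# of what Nevermind is described as doing. Thresholds and scale are
# illustrative assumptions only.

def adjust_difficulty(current_difficulty, fear_score,
                      calm_threshold=0.3, stress_threshold=0.7):
    """Nudge difficulty down when the player is stressed, up when calm.

    fear_score: 0.0 (calm) .. 1.0 (very stressed), e.g. inferred from
    webcam-based expression analysis.
    """
    if fear_score >= stress_threshold:
        current_difficulty -= 1      # ease off to reduce frustration
    elif fear_score <= calm_threshold:
        current_difficulty += 1      # calm player: raise the challenge
    return max(1, min(10, current_difficulty))   # clamp to a 1-10 scale
```

Because easing off is tied directly to the player's stress, staying calm is the only way to keep the difficulty up, which is exactly the self-regulation incentive described above.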

As creator Erin Reynolds told The Week, there is also less of a privacy concern here too, with Nevermind using data purely to enhance the user experience for consenting players (and not for further targeting).

What does the future hold?

The opportunities afforded by this technology extend further than digital advertising and market research.

Another industry that could capitalise on it is healthcare, with AI-powered detection software potentially helping to determine when patients need medication, or helping doctors prioritise which patients to see.

Another is the automotive industry, as emotion-detection software in cars can enhance the overall user experience while also improving car safety. The latter is a particularly interesting area, and one that Affectiva has already made strides in researching and testing.

From detecting distracted drivers to monitoring levels of drowsiness, this type of data could be hugely valuable for both brands and consumers, surely making it only a matter of time before it is integrated into vehicles.
