Behavioral Signals: AI that predicts if you’re going to buy from the emotion in your voice

Robert Scammell, who covers AI and automation at Verdict (UK), interviewed our CEO, Rana Gujral, at the AI Everything Conference in Dubai about behavioral prediction and how Behavioral Signals achieves it. His article is titled “Behavioral Signals: AI that predicts if you’re going to buy from the emotion in your voice”.

Imagine knowing a person’s intention to buy a product with more than 80% accuracy, solely from the emotion in their voice. That is the promise of Behavioral Signals, an artificial intelligence (AI) startup based in Los Angeles, US.

“We think of ourselves as the TensorFlow of emotion engines,” says Gujral, who describes his company’s offering as “emotion as-a-service”.

“Any place where there’s a human to machine interaction, we have the ability to have the machine be more emotionally aware, and be more in tune with how the human is feeling at a certain point,” he says.

“The engine can deduce in real-time who is speaking, when certain words are spoken and how fast the person is speaking and who is speaking most and who is speaking least etc.”

With great power comes great responsibility

However, as with any powerful technology, Behavioral Signals’ emotion engine carries social responsibility.

“You could misuse this technology quite a bit,” says Gujral. He explains how one “large company” approached Behavioral Signals for a very specific use case for “social good”.

It involved using Behavioral Signals’ emotion engine to detect the abuse of children in the home or elderly people in care homes.

“So they want to build this device that is in your home, is always listening – is listening to everything. And it’s passing out specific signals of distress and duress and abuse. And, based on that, certain actions will be taken: alerts, [the] authorities might be called, etc.”

Gujral describes it as a “phenomenal use case”, but as with many good intentions, it comes fraught with problems.

Beyond the privacy infringements, it also opens up the possibility of predicting crimes before they have been committed based on intent, reminiscent of Minority Report.

“You could apply that into sneaking into people’s lives, police can monitor you, big government can get into their homes and start to understand and take action based on your intent.”

Read the full interview on Verdict

Sign up for our bi-monthly EMOTION AI newsletter (brief, we promise)