What We Believe About Amazon’s Emotion Detection Wearable
Fortune writer Don Reisinger wrote a follow-up article on Bloomberg’s report about Amazon’s new wearable, which will reportedly measure emotions. Reisinger asks, “What makes Amazon think it can do a good job of understanding our emotions, when we often struggle to identify them ourselves?”
So he asked Nielsen and our CTO, Alexandros Potamianos, what we think of Amazon’s emotion-detecting wearable, whether it can work, and what the privacy issues are: “What do you think of the concept of a wearable evaluating human emotions and sending ads based on that? Can it work? Are there privacy implications? Does the plan make sense?”
Here at Behavioral Signals we strongly believe that emotion recognition from voice can be achieved with high accuracy (that’s what we do) and that the results can genuinely improve our lives. It’s not as simple as it sounds: it takes deep knowledge of natural language understanding, piles of data, and machine learning scientists to achieve these results. How an integrator decides to use those outputs in its own applications is an interesting conversation. While Amazon might decide to use the results of its emotion detection to push ads, we argue that the results can be useful even for this kind of application, if done correctly, with empathy. It’s a matter of using emotion recognition analytics in a way that benefits the person.
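For readers curious what is involved under the hood, here is a minimal, illustrative sketch of a voice-based emotion classifier in Python. This is not Behavioral Signals’ or Amazon’s actual pipeline; the feature set, emotion labels, and model choice are assumptions made purely for illustration, and a real system would be trained on large labelled speech corpora rather than the random placeholder data shown here.

```python
# Illustrative sketch only: acoustic features + a simple classifier.
# File paths, labels, and model choice are hypothetical placeholders.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["neutral", "happy", "sad", "angry", "stressed"]  # example label set

def extract_features(wav_path: str) -> np.ndarray:
    """Summarise one utterance as mean/std MFCCs plus pitch statistics."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # spectral shape
    f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)         # pitch contour
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1),
                           [np.nanmean(f0), np.nanstd(f0)]])  # 28-dim vector

# In practice the model is trained on labelled speech recordings;
# random arrays stand in for those features and labels here.
X_train = np.random.randn(200, 28)           # placeholder feature matrix
y_train = np.random.choice(EMOTIONS, 200)    # placeholder emotion labels

model = make_pipeline(StandardScaler(), SVC(probability=True))
model.fit(X_train, y_train)

# Classifying a new utterance (the path is hypothetical):
# probs = model.predict_proba([extract_features("utterance.wav")])[0]
# print(dict(zip(model.classes_, probs.round(2))))
```

The point of the sketch is simply that the classifier’s output is a probability over emotional states, and it is the integrator who decides what to do with that signal, whether that is routing a call, adapting a conversation, or, as discussed here, targeting an ad.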
So head over to the article, but for clarity, here’s our response to the journalist’s question:
Capturing human emotions through any sort of medium is obviously something we strongly support here at Behavioral Signals. We measure and classify emotions and behaviours, and even predict them; we dream of a future with AI technology that incorporates empathy. How the technology is utilised is completely up to the integrator, and we can definitely see potential in ads if they are appropriately sensitive to and aware of the person’s feelings. Suggesting a good podcast to listen to, or a book to read, to someone who is feeling stressed might not be a bad idea.
Emotion recognition in voice is a fairly new technology, but yes, it can work, and it can prove a useful tool in the relationship between humans and machines. As for privacy, every new technology gives rise to fears, but we believe regulators will eventually step in to protect the consumer. Emotions are not unique in themselves; we all feel fear, joy, excitement, sadness, stress, and so on. Emotional awareness is very important, providing valuable insights that can help us improve our lives. Think of the possibilities for people with mental health issues or for elderly people in distress.