Teaching Computers to Understand Human Feelings
The importance of human-machine interaction in our daily lives and its technological challenges were the theme of a recent Forbes article featuring Behavioral Signals. Yiannis Mouratidis interviewed our CEO, Rana Gujral, on teaching computers to understand human feelings.
According to Gujral, a critical feature that differentiates Behavioral Signals from its competitors is our algorithms' ability to predict human emotion from digitized voice data. Most other companies have built emotion engines that target specific vertical markets, whereas Behavioral Signals' technology spans all verticals.
The interaction between humans and machines has often been described as dysfunctional because people feel awkward when they communicate with an impersonal bot. According to recent research from the Yale School of Management, the accuracy of human emotion recognition drops when the analysis combines voice and image data rather than relying on voice alone. Consequently, if Behavioral Signals' voice-based solution were applied to a robot, processing only voice data to detect human emotions would yield better results.
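To make the voice-only idea concrete, here is a minimal, generic sketch of what an emotion classifier working on audio alone might look like. This is not Behavioral Signals' algorithm: the librosa features, the logistic-regression classifier, and the file names below are illustrative assumptions.

```python
# A generic, illustrative sketch of voice-only emotion classification.
# This is NOT Behavioral Signals' method; the file paths, labels, and
# feature/classifier choices here are assumptions for demonstration only.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def voice_features(path: str) -> np.ndarray:
    """Extract a fixed-length feature vector from an audio file:
    mean MFCCs (spectral shape) plus pitch and energy statistics."""
    y, sr = librosa.load(path, sr=16000)            # mono audio at 16 kHz
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)   # pitch contour
    rms = librosa.feature.rms(y=y)[0]               # per-frame energy
    return np.concatenate([mfcc, [f0.mean(), f0.std(), rms.mean(), rms.std()]])

# Hypothetical labeled clips -- placeholders, not a real dataset.
clips = ["angry_01.wav", "neutral_01.wav", "happy_01.wav"]
labels = ["angry", "neutral", "happy"]

X = np.stack([voice_features(p) for p in clips])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Classify an unseen recording (hypothetical file name).
print(clf.predict([voice_features("unseen_call.wav")]))
```

A production system would presumably use far richer acoustic and conversational features, but the basic pipeline shape, extracting prosodic features from the voice signal and then classifying them, follows the same idea.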
Read more on how computers are learning to understand human emotions here.