Behavioral Signal Processing for Home Assistants and Robotics

Researchers at SAIL have a long history of studying how humans interact; in fact, they were the first to look at how children with autism interact with robots. Using signal processing and machine learning on images, they were able to track positive and negative child-robot interactions. This work relied on measuring proxemics, i.e., the distance between the child and the robot and their locations in the room. In another experiment, the uncertainty of children interacting with a virtual tutoring system was quantified from their vocal expression. Today, approaches to modeling human-computer interaction (HCI) have found far-reaching applications in in-car voice command systems, Siri, Amazon Alexa, and Google Home. Behavioral Signals builds on this expertise to create emotional awareness for the next generation of home assistants and robots.
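
To make the proxemics idea concrete, here is a minimal, illustrative Python sketch (our own example, not the SAIL implementation): given tracked child and robot floor positions per video frame, it computes the frame-by-frame distance and a few simple interaction features. The function name, the 1.2 m "close interaction" threshold, and the feature choices are all assumptions for illustration.

```python
import numpy as np

def proxemic_features(child_xy, robot_xy, approach_threshold_m=1.2):
    """Compute simple proxemic features from tracked 2-D floor positions.

    child_xy, robot_xy: arrays of shape (T, 2), one (x, y) position in metres
    per video frame. The 1.2 m threshold is an illustrative choice, not a
    value reported by SAIL.
    """
    child_xy = np.asarray(child_xy, dtype=float)
    robot_xy = np.asarray(robot_xy, dtype=float)

    # Frame-by-frame child-robot distance: the core proxemics signal.
    distance = np.linalg.norm(child_xy - robot_xy, axis=1)

    # Fraction of frames spent inside the "close interaction" zone.
    close_fraction = float(np.mean(distance < approach_threshold_m))

    # Average change in distance per frame: negative means the child
    # is, on the whole, approaching the robot.
    approach_rate = float(np.mean(np.diff(distance)))

    return {
        "mean_distance_m": float(distance.mean()),
        "close_fraction": close_fraction,
        "approach_rate_m_per_frame": approach_rate,
    }

# Example: a child drifting toward a stationary robot over 100 frames.
t = np.linspace(0, 1, 100)
child = np.column_stack([3.0 - 2.0 * t, np.ones_like(t)])  # walks from x=3 m to x=1 m
robot = np.tile([0.0, 1.0], (100, 1))                      # robot fixed at (0, 1)
print(proxemic_features(child, robot))
```

Features like these, tracked over a session, are one simple way to summarize whether an interaction is trending positive (sustained close engagement) or negative (withdrawal).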

Whitepaper on Behavioral Signal Processing

Discover more about behavioral signal processing in our whitepaper.
