Empathetic Tech: EQ for our AIs?

Originally published by Kingston Celine in Artificial Intelligence on Medium

What is Empathetic Technology?

Empathetic technology is “technology that is using our internal state to decide how it will respond and make decisions,” says Poppy Crum, Chief Scientist at Dolby Laboratories and Adjunct Professor at Stanford University. Today our wearables and other IoT devices can pick up on our routines, habits, and idiosyncrasies, often imperceptibly to us, and adjust accordingly. An apt example is Apple’s watchOS 7, which can detect hand-washing motions and sounds and start a 20-second countdown timer; if the user finishes early, they are prompted to complete the countdown. The Apple Watch will also recognize when you have returned home and prompt you to wash your hands.

This example, however, is child’s play compared to where empathetic tech is going. There are already several neuropsychological “tells” that tech can pick up on. Some hardware can track the dilation of our pupils, for example, to detect cognitive states like cognitive overload (struggling to understand something), or read our stress and anxiety levels from sweat secretions. Switch over to AI and there are several applications in what we call affective computing that can now detect our emotions based on our facial expressions, voices, body gestures, and language usage. Several players operate in this space, including Microsoft’s Emotion API, which receives pictures and video, first identifies faces, and then recognizes the emotions expressed on those faces.
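The two-stage structure described here (find the faces, then score emotions on each one) can be sketched in a few lines. This is a minimal illustration, not Microsoft’s actual API: the stub detector and classifier below are hypothetical stand-ins for real computer-vision models, and the input format is invented for the example.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Face:
    box: tuple  # (x, y, w, h) bounding box within the image

def detect_faces(image: dict) -> List[Face]:
    # Stage 1 (stubbed): a real system would run a face detector here.
    return [Face(box=b) for b in image.get("faces", [])]

def classify_emotions(face: Face) -> Dict[str, float]:
    # Stage 2 (stubbed): a real classifier returns a confidence
    # score per emotion for the cropped face region.
    return {"happiness": 0.90, "neutral": 0.08, "anger": 0.02}

def analyze(image: dict) -> List[Dict[str, float]]:
    # The pipeline: one emotion-score dictionary per detected face.
    return [classify_emotions(face) for face in detect_faces(image)]
```

The key design point is the decoupling: detection and classification are separate stages, so either model can be swapped out without touching the other.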

The tech isn’t perfect, and the hard part about implementing any of it is that human interactions and emotions are inherently multimodal. We send out a lot of information across different channels. The combination of our body language, facial expressions, and tone of voice can send an entirely different message than any of those signals taken in isolation. Combining and “translating” all these different messages into one cohesive sentiment is where affective computing continues to struggle.
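A toy example makes the multimodal problem concrete. Suppose each channel independently produces a sentiment score in [-1, 1] and we fuse them with a weighted average (this “late fusion” scheme and the numbers below are illustrative assumptions, not a description of any real system):

```python
def fuse_sentiment(scores: dict, weights: dict) -> float:
    """Weighted late fusion of per-channel sentiment scores in [-1, 1]."""
    total = sum(weights[ch] for ch in scores)
    return sum(scores[ch] * weights[ch] for ch in scores) / total

# Sarcasm-like case: the words are positive, but tone and face are negative.
channels = {"language": 0.8, "voice": -0.6, "face": -0.4}
equal_weights = {"language": 1.0, "voice": 1.0, "face": 1.0}

fused = fuse_sentiment(channels, equal_weights)  # roughly -0.07, near neutral
```

A text-only system would read this speaker as clearly positive (0.8), while the fused score is nearly neutral, and neither may match the true sentiment. Choosing the weights, or learning a fusion model that captures interactions between channels, is exactly where the hard work lies.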