Originally published on Artificial Intelligence.
When we think of robots, we often picture mechanical objects that repeatedly carry out a simple task or serve in the more basic roles in society. And while films such as Big Hero 6 and WALL-E have portrayed robots that can mimic human behaviour, the idea of a robot not only interacting with humans but understanding the nuances of their behaviour seemed almost impossible.
This is where Rana el Kaliouby comes in. An academic who studied at Cambridge and MIT, she has spent her career tackling an increasingly important limitation of technology: computers do not understand humans. She became the co-founder of a Boston-based start-up called Affectiva and has been working in the dynamic field of Human-Robot Interaction (HRI) for more than 20 years.
In a recent interview, Ms el Kaliouby stated, “Technology today has a lot of cognitive intelligence, or IQ, but no emotional intelligence, or EQ,” and went on to say, “We are facing an empathy crisis. We need to redesign technology in a more human-centric way.”
This is not a major concern for AI that performs data-driven, logical tasks such as data processing, but it becomes a bigger one when the AI is in contact with clients, whether as an AI receptionist or a robot driver. Increasingly, artificial intelligence is being used in direct contact with humans. This demand has led to the emergence of Emotional AI, which aims to build trust in computers and artificial intelligence by improving how these technologies interact with humans. Emotional AI classifies and responds to human emotions by drawing on a multitude of data sources, such as scanning eye movements, analysing voice levels and examining the sentiments expressed in emails, and it has seen significant uptake across industries such as gaming, advertising, call centres and insurance. However, there are concerns that Emotional AI might have the adverse effect of further damaging trust in technology if it is misused to manipulate consumers.
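As a toy illustration of the kind of classification Emotional AI performs, here is a sketch covering just one of the data sources mentioned above, the sentiment of written text. The keyword-scoring approach and the word lists are purely illustrative assumptions; real systems use far more sophisticated models.

```python
# Toy sketch of text-based emotion classification. Keyword scoring
# stands in for the machine-learning models real Emotional AI uses;
# the word lists below are illustrative, not from any real product.

POSITIVE = {"great", "thanks", "happy", "love", "excellent"}
NEGATIVE = {"angry", "terrible", "frustrated", "hate", "unacceptable"}

def classify_sentiment(text: str) -> str:
    """Label a message's sentiment as positive, negative or neutral."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify_sentiment("I am frustrated with this unacceptable delay"))
# -> negative
```

A production system would combine a signal like this with the other channels (eye movement, voice level) before deciding how to respond.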
Affectiva isn’t the only company that has registered interest in this field of technology. Amazon recently filed for patents in emotion-detecting technology that could tell when a user is happy, angry, excited or stressed, and would help its voice assistant Alexa to suggest music to suit their mood. Affectiva has in fact developed a vehicle AI system which is able to recognise when a driver is angry, distracted or drowsy and responds accordingly by tightening the seat belt or adjusting the temperature. If successful, these kinds of emotion-recognition technologies would be highly valuable to companies, as they would allow them to interact and engage with their customers on a deeper level. However, the use of emotion data is also risky, as it could be ambiguous and lead to fatal mistakes for the businesses that employ it. Other research institutes also stress that Emotional AI should not be used in situations that require a high degree of human judgement, such as interviewing someone, therapy or assessing pain.
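The detect-and-respond behaviour of such an in-car system can be sketched as a simple mapping from driver state to cabin intervention. The state labels and the particular responses below are assumptions loosely based on the description above, not Affectiva's actual design:

```python
# Illustrative mapping from a detected driver state to a cabin
# response. The states and interventions are assumptions for the
# sake of example, not Affectiva's actual system design.

RESPONSES = {
    "drowsy": "tighten seat belt",
    "angry": "lower cabin temperature",
    "distracted": "sound alert chime",
}

def respond_to_driver(state: str) -> str:
    """Return the cabin intervention for a detected driver state."""
    return RESPONSES.get(state, "no action")

print(respond_to_driver("drowsy"))  # -> tighten seat belt
print(respond_to_driver("calm"))    # -> no action
```

Even in this toy form, the design choice matters: a misclassified state triggers the wrong intervention, which is exactly the ambiguity risk the paragraph above describes.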
But there are still many ways that Emotional AI can have positive effects outside the corporate world. In her latest book, Girl Decoded, Ms el Kaliouby explains how this technology can be an important tool for making technology more accessible to humans; her academic research focused on how facial recognition technology could help autistic children interpret feelings. Yet she also stressed that, in order for these technologies to gain users’ trust, the AI system should only be used with the consent of the user, who should also have the option to opt out.
Looking forward, the development of emotional artificial intelligence could go badly wrong, leading to worse outcomes for a business than the original method of human interaction, or it could work too well, giving businesses the opportunity to exploit this technology and manipulate consumers. It is up to those who supply these systems to regulate their use and ensure maximum welfare for consumers.
Cover photo: https://blogs.ubc.ca/angryvietnamese/2016/02/17/revisiting-wall-e/