In an effort to improve the ability of robots to fulfil these functions, researchers have been working on expanding their “understanding” of humans. A team at Case Western Reserve University has been adapting the AI that currently powers interactive video games so it can correctly identify human emotions from facial expressions, while another team has been using human movement to teach robots how to recognise emotions.
The first team has achieved remarkable results: its AI correctly identifies human emotions 98% of the time, almost instantly. Earlier work by other researchers had reached similar accuracy, but the robots often responded too slowly. By combining two pre-processing video filters with another pair of existing programs, helping the robot classify emotions based on more than 3,500 variations in human facial expression, the Case Western Reserve team was able to accelerate the response time.
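To make this concrete, the sketch below shows what such a filter-then-classify pipeline might look like in Python with OpenCV. It is only an illustration of the general approach described above, not the Case Western Reserve team's code: the two filters (histogram equalisation and Gaussian denoising), the 48×48 face crop, and the `emotion_model` object with a Keras-style `predict` method are all assumptions introduced for the example.

```python
# Minimal sketch of a filter-then-classify pipeline for facial emotion
# recognition. This is NOT the published implementation: the two
# pre-processing filters and the `emotion_model` are placeholders.
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

EMOTIONS = ["happy", "sad", "angry", "surprised", "fearful", "disgusted", "neutral"]

def preprocess(frame):
    """Apply two cheap pre-processing filters before classification."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)             # filter 1: normalise lighting
    gray = cv2.GaussianBlur(gray, (3, 3), 0)  # filter 2: suppress sensor noise
    return gray

def classify_frame(frame, emotion_model):
    """Detect the largest face in a frame and return a predicted emotion."""
    gray = preprocess(frame)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype(np.float32) / 255.0
    probs = emotion_model.predict(crop[None, ..., None])  # hypothetical model API
    return EMOTIONS[int(np.argmax(probs))]
```

The point of the pre-processing step is simply that cleaner, normalised input lets a lightweight classifier respond quickly enough for real-time interaction.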
The other study, conducted by researchers from Warwick Business School, University of Plymouth, Donders Centre for Cognition at Radboud University in the Netherlands, and the Bristol Robotics Lab at the University of the West of England, found that robots can learn to use movements, in addition to facial expressions and tone of voice, to recognise human emotions.
This team filmed pairs of children playing with a robot and a computer built into a table with a touchscreen top. The videos were shown to 284 study participants, who were asked to decide whether the children were excited, bored, or sad. They were also asked whether the children were co-operating, competing, or whether one of the children had assumed a dominant role in the relationship.
Only some of the participants watched the original videos. A second group saw the footage reduced to “stickman” figures that reproduced exactly the same movements. Members of both groups agreed on the same emotional label for the children more often than would be expected by chance. The researchers then trained a machine-learning algorithm to label the clips, identifying the type of social interaction, the emotions on display, and the strength of each child’s internal state, allowing it to judge which child felt sadder or more excited.
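As a rough illustration of that training step, the sketch below fits an off-the-shelf classifier to flattened “stickman” pose coordinates using scikit-learn. The feature layout, the synthetic data, and the three emotion labels are assumptions made for demonstration; the study's actual features, annotations, and model are not reproduced here.

```python
# Minimal sketch: train a classifier on "stickman" pose features.
# Data and labels are synthetic stand-ins, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_CLIPS, N_FRAMES, N_JOINTS = 200, 30, 15
LABELS = ["excited", "bored", "sad"]

# Each clip: (x, y) coordinates of N_JOINTS joints across N_FRAMES frames,
# flattened into one feature vector per clip (stand-in for real pose data).
X = rng.normal(size=(N_CLIPS, N_FRAMES * N_JOINTS * 2))
y = rng.integers(0, len(LABELS), size=N_CLIPS)  # placeholder human annotations

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))  # ~chance on random data
```

With real pose sequences and human-agreed labels in place of the random arrays, the same recipe lets the model learn to map movement alone to emotion and interaction-type labels.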
Both teams of researchers believe that the potential applications of “social” robots are huge. While the technology is still in its early stages, they expect that more emotionally perceptive machines will eventually be able to detect changes in a person's health or mental state. For example, robots trained to identify human emotion could notice significant changes in a person through daily interaction, determine whether a child or an elderly person is in distress and needs help, or even detect early signs of depression.
However, developing emotional intelligence in robots is a difficult task, requiring a combination of computer “vision” to interpret objects and people, and software that can respond accordingly. Because the goal is to understand what a human is feeling, psychology and trust come into play just as much as the technology. The findings from the research teams working on teaching AI to identify emotion show that while the undertaking is challenging, it is far from impossible.