A robot that knows how you are doing
It is not always easy to tell what mood the people close to you are in, so a robot that can do just that may sound a bit like science fiction.
Photo: Jukka Konttinen.
“One of our goals is that the robot can interpret basic emotions such as sadness, fear, anger, excitement, surprise and disgust from facial expressions, and adjust its actions accordingly,” says Professor Matti Pietikäinen from the University of Oulu. He is involved in a project that studies and develops affective human–robot interaction.
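Pietikäinen's group is known for the Local Binary Pattern (LBP) texture descriptor, which is widely used in facial expression analysis. As a toy illustration (not the project's actual pipeline), an LBP code compares each pixel to its eight neighbours, and a histogram of these codes over a face image gives a feature vector that an expression classifier can compare against learned models:

```python
def lbp_code(img, y, x):
    # 8-neighbour Local Binary Pattern: threshold each neighbour
    # against the centre pixel and pack the results into one byte.
    center = img[y][x]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    # 256-bin histogram of LBP codes over the interior pixels;
    # expression recognition then compares such histograms
    # (e.g. per face region) against trained emotion models.
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    return hist
```

Real systems extract these histograms from many face regions, often over time, and feed them to a trained classifier; the sketch above only shows the descriptor itself.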
The project combines the latest advances in machine vision, speech animation and navigation, enabling the robot of the future to recognise people's spontaneous expressions and gestures, understand speech and react to it.
The robot observes its surroundings while its sensors produce the information it needs to move around on its own.
The robot recalls previous interaction
The robot’s machine vision uses a normal video camera and a Kinect depth camera. Microphones installed on the robot help it recognise the volume and direction of speech.
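One common way to estimate the direction of speech from a pair of microphones is to measure the tiny time difference with which the sound reaches them. A minimal sketch, assuming two microphones a hypothetical 20 cm apart and a 16 kHz sample rate (the article does not describe the project's actual array):

```python
import math

SOUND_SPEED = 343.0   # m/s, approximate speed of sound in air
MIC_SPACING = 0.2     # m between the two microphones (assumption)
SAMPLE_RATE = 16000   # Hz (assumption)

def best_lag(left, right, max_lag):
    # Brute-force cross-correlation: find the sample shift that
    # best aligns the right channel with the left channel.
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i - lag]
                    for i in range(max_lag, len(left) - max_lag))
        if score > best_score:
            best, best_score = lag, score
    return best

def direction_of_arrival(left, right):
    # Convert the inter-microphone delay into a bearing angle:
    # sin(angle) = (speed * delay) / spacing.
    max_lag = int(MIC_SPACING / SOUND_SPEED * SAMPLE_RATE) + 1
    delay = best_lag(left, right, max_lag) / SAMPLE_RATE
    ratio = max(-1.0, min(1.0, SOUND_SPEED * delay / MIC_SPACING))
    return math.degrees(math.asin(ratio))
```

With more than two microphones the same idea gives a full bearing rather than a left/right angle; production systems also use frequency-domain methods such as GCC-PHAT for robustness.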
“The speech recognition module is connected to a talking head that can synthesise speech and lip movement. The answers the robot gives are based on its internal database and on external information acquired from the Internet,” Pietikäinen explains.
In addition, the robot learns to recognise different people and can remember its previous interactions with them. It compares the data it receives to the emotion and interaction models it has been taught, and based on this information it can communicate through sound and movement.
What is it for?
The aim is to achieve human–robot interaction that resembles interaction between people as closely as possible. The social robot won't be moving into every household in the next few years, however; it is estimated to enter the consumer market within ten to fifteen years.
The robot has applications in various fields, such as elderly care, health care, security and logistics.
It can also be used, for example, to study the effects of entertainment by observing viewers' reactions to commercials and films that portray strong emotions.
The project is part of the Academy of Finland's Ubiquitous computing and diversity of communication programme. Human–robot interaction is studied at Infotech Oulu in collaboration with partners including Chinese universities.