The work is being done by the Feelix Growing project, which involves six countries and 25 robotics experts, developmental psychologists, and neuroscientists. Along with the software, the project is developing "neural networks" of cameras and sensors that help the robots detect a person's facial expressions, voice, proximity, and other parameters to determine emotional state, according to ICT Results, a news service created to showcase European Union-funded research.
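The sensor fusion described above, combining cues such as facial expression, voice, and proximity into an estimate of a person's emotional state, can be illustrated with a minimal sketch. Everything here (the cue names, weights, and state labels) is invented for illustration; the project's actual software uses learned neural networks rather than hand-tuned scores.

```python
# Hypothetical sketch: fuse normalized sensor cues (each 0.0-1.0) into a
# coarse emotional-state label. Weights and labels are illustrative only,
# not taken from the Feelix Growing software.

def estimate_emotion(smile, voice_pitch, proximity):
    """Return a coarse emotional-state label from three sensor cues."""
    # Simple linear score per candidate state (hand-tuned, illustrative).
    scores = {
        "happy":      0.6 * smile + 0.3 * voice_pitch + 0.1 * proximity,
        "distressed": 0.5 * (1 - smile) + 0.4 * voice_pitch
                      + 0.1 * (1 - proximity),
        "neutral":    0.4,  # fixed baseline so weak cues fall through
    }
    # Pick the state with the highest score.
    return max(scores, key=scores.get)

print(estimate_emotion(smile=0.9, voice_pitch=0.5, proximity=0.8))
```

In a trained system the weights would be learned from labeled examples, but the structure, many weak cues combined into one estimate, is the same idea.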
Researchers in the 3-year-old project are building demonstration robots as proofs of concept. One of the demos follows researchers around, learning from experience when to trail behind or stick close. Researchers also are working on a robot face that can simulate emotional expressions.
"It's mostly behavioral and contact feedback," project coordinator Lola Canamero said in telling the BBC News how the robots learn. "Tactile feedback and emotional feedback through positive reinforcement, such as kind words, nice behavior, or helping the robot do something if it is stuck."
In the future, researchers hope the robots can be taught to differentiate when a person cries out in pain, in anger, or in happiness and respond accordingly. The idea behind the research is that if robots can adapt to people's behavior, they can play a bigger role in society as domestic helpers or in assisting the sick, the elderly, people with autism, or house-bound people, Canamero said.
The technology under development combines research in robotics, adaptive systems, developmental and comparative psychology, neuroscience, and ethology, the study of human and other animal behavior. While the project is a long way from the human-like robots portrayed in the popular movie "I, Robot," starring Will Smith, the Feelix Growing work is seen as a big step forward for robotics.