At the CeBIT 2006 exhibition in Hannover, researchers from the Fraunhofer Institute for Computer Graphics Research IGD in Rostock, Germany, are expected to present techniques that could enable a computer or robot to respond to the mood of its human master.
Humans display emotions in a variety of ways, many of which a computer can detect. Posture, fidgeting and facial expressions such as smiling and frowning can be observed and classified by a camera with image-analysis software. Heartbeat, breathing rate, blood pressure, skin temperature and the electrical resistance of the skin are more subtle indicators.
“We have developed a glove that has sensors for measuring parameters like these,” said Christian Peter, engineer at the department for Human-Centered Interaction Technologies, in a statement. “It is connected to a device that evaluates and saves the data. We are also working on techniques that will enable computers to interpret facial expressions and extract emotional elements from voice signals,” he added.
Interpreting all the data is difficult, since emotions are by their very nature ambiguous, transient and hard to describe. Fraunhofer said that at present its method works only if the user trains the computer in advance.
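The article does not describe Fraunhofer's actual algorithm, but the idea of training on a user's own sensor readings beforehand can be illustrated with a minimal sketch: here a nearest-centroid classifier over hypothetical glove measurements (heart rate, breathing rate, skin conductance, skin temperature). The feature set, units, sample values and emotion labels are all assumptions for illustration, not the institute's method.

```python
import math

# Hypothetical per-user training data collected in advance. Each sample is
# (heart_rate_bpm, breathing_rate_bpm, skin_conductance_uS, skin_temp_C),
# grouped under the emotion the user reported at the time. Values are invented.
TRAINING = {
    "calm":     [(62, 12, 2.1, 33.5), (65, 13, 2.3, 33.2)],
    "stressed": [(88, 19, 6.8, 31.0), (92, 21, 7.4, 30.6)],
}

def centroid(samples):
    """Mean feature vector of a list of equal-length samples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(reading, training):
    """Label a new reading with the emotion whose centroid is nearest (Euclidean)."""
    best_label, best_dist = None, float("inf")
    for label, samples in training.items():
        d = math.dist(reading, centroid(samples))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

print(classify((90, 20, 7.0, 30.8), TRAINING))  # nearest the "stressed" samples
```

Because the centroids are computed from one person's own labelled readings, the classifier is inherently user-specific, which mirrors the article's point that the system must be trained by its user in advance.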