News
8/5/2005 02:04 PM

Emerging Graphics Technology Showcased

Last week’s Siggraph conference highlighted applications ranging from NASA’s Mars rover landings to facial-emotion enhancement for videoconferencing and cameras embedded inside sports balls.

LOS ANGELES — Star Wars and NASA's Mars rover landings have one thing in common: killer graphics. The two worlds they represent, of fantasy and fact, came together here at the annual Siggraph conference.

Siggraph's popular Emerging Technologies pavilion offered a taste of how computer graphics and imaging will one day be used in interfaces, visualization and the presentation of content.

Presentations from Japanese teams explored the use of graphics technology to represent facial emotions and techniques for embedding cameras into a moving ball, offering a whole new angle on baseball or other sports. A team at Chiba University in Japan detailed the Color-Enhanced Emotion system, which recognizes facial expressions in computer graphics content and controls skin-pigment components using a real-time processor to enhance them. The result is an "emotional facial expression."

"This is very important for Japanese people," said head researcher Toshiya Nakaguchi. "They tend to show little emotion in face-to-face meetings."

With video phones and video chat becoming more commonplace, the researchers said, it will be increasingly important to control image quality in a limited-bandwidth environment by applying emotion effects in real time at a reasonable cost. The technique could also be applied to movie editing, the team maintained.

The Color-Enhanced Emotion system uses computer vision techniques to recognize feelings expressed in facial images, and then implements a hardware-accelerated real-time processing system to control the pigment components of the skin by replicating a broad range of conditions with color enhancements: fair, suntanned, pale, red-faced and so on. Accurate registration cameras decompose the surface reflection of the face to enhance it with the colors associated with commonly observed emotions.
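The core idea described above — nudging the color of skin pixels toward a tone associated with a recognized emotion — can be illustrated with a minimal sketch. This is not the Chiba University implementation; the emotion names, target tones, blend factor, and function names below are illustrative assumptions.

```python
# Illustrative target skin tones (R, G, B) per emotion; the real system
# controls skin-pigment components, which this crude RGB blend only mimics.
EMOTION_TONES = {
    "red-faced": (220, 100, 90),   # flushed / embarrassed
    "pale":      (230, 220, 210),  # frightened
    "suntanned": (180, 130, 90),
}

def enhance_pixel(rgb, emotion, strength=0.3):
    """Blend one skin pixel toward the tone for `emotion`.

    strength=0 leaves the pixel unchanged; strength=1 replaces it outright.
    """
    target = EMOTION_TONES[emotion]
    return tuple(round(c + strength * (t - c)) for c, t in zip(rgb, target))

def enhance_image(pixels, skin_mask, emotion, strength=0.3):
    """Apply the tint only where a precomputed skin mask is True."""
    return [
        enhance_pixel(p, emotion, strength) if m else p
        for p, m in zip(pixels, skin_mask)
    ]

# Example: tint one skin pixel toward "red-faced".
print(enhance_pixel((200, 160, 140), "red-faced"))  # -> (206, 142, 125)
```

In the actual system this per-pixel operation runs on a hardware-accelerated real-time processor after computer-vision recognition of the expression; the sketch only shows the color-enhancement step in isolation.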

Elsewhere, Andy Wilson at Microsoft Research wants to let users control objects in displays by movement and gestures. In a demonstration of the company's TouchLight technology, a transparent acrylic-plastic 4 x 3-foot board — actually an advanced optical-lens screen from dnpDenmark, a company near Copenhagen — was mounted vertically on a jig. Three off-the-shelf cameras and a projector were placed in the back. Like the futuristic displays in the film Minority Report, an otherwise normal-looking sheet of plastic was transformed into a high-bandwidth input/output surface suitable for gesture-based interaction.
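One idea behind camera-based touch surfaces of this kind can be sketched simply: once each camera's image is rectified into a shared screen coordinate system, an object actually touching the surface appears at (nearly) the same rectified position in multiple cameras, while objects hovering behind it do not. The sketch below assumes two cameras and already-rectified candidate points; the data structures, tolerance, and function name are illustrative assumptions, not Microsoft's implementation.

```python
def detect_touches(points_cam_a, points_cam_b, tolerance=5.0):
    """Return screen positions where both camera views agree an object is.

    points_cam_*: lists of (x, y) candidates, already rectified into a
    shared screen coordinate system.
    """
    touches = []
    for ax, ay in points_cam_a:
        for bx, by in points_cam_b:
            if (ax - bx) ** 2 + (ay - by) ** 2 <= tolerance ** 2:
                # Both views agree: treat the midpoint as the touch point.
                touches.append(((ax + bx) / 2, (ay + by) / 2))
    return touches

# A fingertip at the surface matches across cameras; a hand hovering
# behind the sheet appears at different rectified positions and drops out.
cam_a = [(100.0, 200.0), (400.0, 300.0)]
cam_b = [(102.0, 201.0), (460.0, 330.0)]
print(detect_touches(cam_a, cam_b))  # -> [(101.0, 200.5)]
```

The agreement test is what lets a plain transparent sheet act as an input surface: depth is inferred from cross-camera consistency rather than from any sensor in the sheet itself.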

"Our current goals include exploration of interaction techniques, signal-processing algorithms and artistic installations that are idiomatic to this configuration," said Wilson.
