Sony demos game controller to track motion and emotion
Video: Hands-free control
By Colin Barras, New Scientist
The latest games console arms race - to perfect hands-free, full-body game control - just got more competitive.
Sony has unveiled just such a system, called the Interactive Communication Unit, or ICU, at the Vision 2009 trade fair in Stuttgart, Germany. It uses stereo cameras to watch a player and, like a pair of eyes, to judge depth.
Microsoft unveiled its own full-body controller, Project Natal, this summer; it is due to be released for the Xbox 360 games console late in 2010.
Like Natal, Sony's system tracks a person's whole body without their having to wear the body markers used in motion-capture studios. Also like Natal, Sony says ICU can detect a player's emotions by watching their facial expressions, and can judge sex and approximate age from their appearance.
Sony Europe's image-sensing division created ICU in collaboration with Atracsys, a small firm in Lausanne, Switzerland, that specialises in optical tracking.
Atracsys already sells Infinitrack, a system that gives medics hands-free control of computers in sterile environments. But its users have to wear small reflective markers like those used in movie-industry motion-capture studios; previous versions required users to wear particular colours.
Casual users can't be expected to do that, says Gaëtan Marti, CEO of Atracsys, which limits the system's precision.
"We cannot at present detect 'finger signs' [but] we can detect where you are looking at on the screen - up, middle, down - and the raw position of your arms [or legs]," he says.
ICU's stereo cameras can detect the position of specific points on the arms, legs and head to within 10 cubic centimetres, compared with the 0.2 cubic millimetre accuracy of Infinitrack.
ICU 'reads' facial expressions using a pattern-matching algorithm that has been trained on pictures of people expressing different emotions. Using cues such as the position and shape of the lips, ICU spots five basic states: happiness, anger, surprise, sadness and neutral.
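Sony has not published how its classifier works, but the pattern-matching approach described above can be sketched in miniature: extract a few geometric features from the lips, average the training examples for each emotion into a template, and label a new face with the nearest template. All feature names and values below are illustrative assumptions, not Sony's.

```python
# Hypothetical sketch of nearest-template expression matching.
# Features per face: (mouth_width, mouth_openness, corner_lift),
# all invented for illustration -- not ICU's actual cues.

EMOTIONS = ["happiness", "anger", "surprise", "sadness", "neutral"]

# Toy "training pictures", reduced to feature tuples per emotion.
TRAINING = {
    "happiness": [(0.9, 0.3, 0.8), (0.8, 0.4, 0.9)],
    "anger":     [(0.5, 0.2, 0.1), (0.4, 0.3, 0.2)],
    "surprise":  [(0.6, 0.9, 0.5), (0.5, 0.8, 0.6)],
    "sadness":   [(0.5, 0.1, 0.1), (0.6, 0.2, 0.0)],
    "neutral":   [(0.6, 0.2, 0.5), (0.7, 0.3, 0.5)],
}

def centroid(samples):
    """Average the training samples into one template per emotion."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

def classify(features):
    """Return the emotion whose template is nearest in Euclidean distance."""
    best, best_dist = None, float("inf")
    for emotion in EMOTIONS:
        c = centroid(TRAINING[emotion])
        dist = sum((a - b) ** 2 for a, b in zip(features, c)) ** 0.5
        if dist < best_dist:
            best, best_dist = emotion, dist
    return best

print(classify((0.85, 0.35, 0.85)))  # a wide, lifted mouth -> "happiness"
```

A production system would use many more features and a trained statistical model, but the matching step is the same in spirit: compare the observed face against learned examples of each state.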
It also has the ability to tune out the visual clutter around a player that could otherwise distort its results. "Once it detects a face 2 metres in front of the cameras, the system can isolate the person by only keeping the information between 1.5 and 2.5 metres away," Marti says.
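The isolation step Marti describes amounts to thresholding the stereo depth map: once a face is detected about 2 metres out, discard every pixel whose depth falls outside a band around it. A minimal sketch, with an invented `isolate_player` helper and toy depth values:

```python
# Illustrative depth-band masking (not Sony's code): keep only pixels
# between 1.5 m and 2.5 m, zeroing background and foreground clutter.

def isolate_player(depth_map, near=1.5, far=2.5):
    """Mask out everything outside the [near, far] depth band (metres).

    depth_map: 2-D list of per-pixel stereo depths, in metres.
    Returns a same-shaped map with out-of-band pixels set to 0.0.
    """
    return [
        [d if near <= d <= far else 0.0 for d in row]
        for row in depth_map
    ]

# Toy 2x4 depth map: a wall at 3 m, the player at ~2 m, clutter at 1 m.
depth = [
    [3.0, 2.0, 2.1, 1.0],
    [3.0, 1.9, 2.2, 1.0],
]
print(isolate_player(depth))
# Only the ~2 m (player) pixels survive the mask.
```

The attraction of this trick is its cheapness: it needs no segmentation model at all, just the per-pixel depth the stereo cameras already provide.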
Sophisticated as it is, however, ICU isn't yet going to be launched into the punishing domestic entertainment market, says Arnaud Destruels, marketing manager at Sony's image-sensing division.
"It's clear that if the consumer has a bad experience with the technology they could reject it without giving it a second chance."
Instead ICU is going to be launched first into the world of advertising, which will be its training ground, says Destruels. Interactive shop windows and billboards will provide a chance to iron out wrinkles in the system and to familiarise people with the concept, he says.
Christian Theobalt, at the Max Planck Institute for Informatics in Saarbrücken, Germany, agrees that people won't be forgiving of a novel interface's failings. "For the consumer to accept this technology it has to work robustly in real time."
Last year Theobalt developed a markerless motion-capture system that uses eight cameras and can track even the sway of clothing. But its footage has to be processed after the fact, a luxury ICU doesn't have.
"If real-time performance is the goal, one has to reduce complexity, which reduces the accuracy one achieves," he explains. Sony will have to balance aiming for complex gesture recognition with the need for dependable performance.
[via New Scientist]