Latest Telepresence and Visual Collaboration News:
Watch This High Fidelity Avatar Animated by a Pixar Vet Imitate Real World Facial Movements in Near Real-Time
Philip Rosedale just sent me this new video demo of an avatar singing Christina Aguilera's "Beautiful" in High Fidelity, his Oculus Rift-compatible virtual world, and if you know all the art and technology behind it, you'll think it's pretty cool:
The singer is actually High Fidelity's Emily Donald (who has a lovely voice), and the avatar is imitating her actual face and lip movements in near real time, as tracked via a PC camera pointed at her. The avatar herself sort of looks like a character in a Pixar movie, and that's no surprise: The facial animations were created by High Fidelity's Ozan Serim, who was a longtime manager at Pixar before joining Philip's company. (Serim worked on Monsters University, Cars 2, Brave, and Toy Story 3 there.) The facial animations are more than enough to convey emotion, and the lip sync is just about perfect. (Bad lip sync remains a horrible problem in Second Life, not to mention other MMOs/machinima platforms.) As it happens, Philip and I were just e-mailing about how live music performance can be a compelling thing in virtual reality, so this video is a case study of that.
How was this shot, and what's the latency between her face movements and the avatar animations? Philip explains:
"This was done by Emily and Ozan (who is playing guitar) in our little back office here. Emily is in front of a Primesense Carmine camera, using Faceshift to detect animation." Philip adds: "Ozan has been doing amazing work designing avatar facial movements that map well to that sort of live camera, and this is the result."