Latest Telepresence and Visual Collaboration News:
The future of 3D video conferencing?
March 30, 2012 | Hogan Keyser
March 27, 2012 via 3DFocus.co.uk -- Part of the Research Strand during the Ravensbourne 3D Storytelling event, the trial set out to give the public a glimpse of what video conferencing could look like in the near future. The audience in Tomsk, Russia was captured by a Panasonic AG-3DA1 camera, while the London audience was captured with two Panasonic P2 cameras mounted on a 3D Factory mirror rig.
Each side-by-side stereo feed was converted in real time into a 2D + Depth Map (2D+Z) video stream by Triaxes NetJet software, which enables stereo content to be displayed on autostereoscopic lenticular screens. The HD video and accompanying depth maps were transmitted in both directions over the roughly 4,000 miles between London and Tomsk. At each end, a Triaxes 3D Media Player received the IP broadcast, de-multiplexed and decoded the audio/video streams, added control information (3D effect settings), and then sent the 2D+Z feed to a 42" Dimenco glasses-free 3D display in London and a 55" display in Tomsk. The Rendering Cores integrated into the Dimenco displays then synthesized 28 viewpoints from the original 2D+Z stream.
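The viewpoint synthesis the displays perform is a form of depth-image-based rendering: each pixel is shifted horizontally in proportion to its depth value, once per virtual camera position. A minimal sketch of that idea in Python follows; the function names, the `max_disparity` scaling, and the simple round-and-clamp warping are illustrative assumptions, not Dimenco's actual rendering pipeline (which also handles occlusion filling and lenticular interleaving).

```python
import numpy as np

def render_viewpoint(image, depth, view_offset, max_disparity=16):
    """Synthesize one virtual viewpoint from a 2D+Z frame.

    image: (H, W, 3) uint8 colour frame
    depth: (H, W) uint8 depth map (255 = nearest to camera)
    view_offset: virtual camera position in [-1.0, 1.0]
    max_disparity: assumed maximum pixel shift (illustrative value)
    """
    h, w = depth.shape
    # Disparity is proportional to depth: nearer pixels shift more.
    shift = (depth.astype(np.float32) / 255.0) * max_disparity * view_offset
    cols = np.arange(w)
    out = np.zeros_like(image)
    for y in range(h):
        # Round the shifted column positions and clamp them to the frame.
        x_new = np.clip((cols + shift[y]).round().astype(int), 0, w - 1)
        out[y, x_new] = image[y, cols]
    return out

def render_all_views(image, depth, n_views=28):
    """Produce n_views viewpoints spanning offsets -1..+1
    (28 views, matching the count reported for the Dimenco panels)."""
    offsets = np.linspace(-1.0, 1.0, n_views)
    return [render_viewpoint(image, depth, o) for o in offsets]
```

The key design point is that only one colour frame plus one depth map cross the network; the bandwidth-heavy multi-view expansion happens locally at the display.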
The video was transmitted as an MPEG transport stream (MPEG TS), the container format commonly used in TV broadcasting.
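An MPEG transport stream is a sequence of fixed-size 188-byte packets, each starting with a 0x47 sync byte and a 4-byte header that carries, among other fields, the packet identifier (PID) used to de-multiplex the audio, video, and data streams at the receiver. A small sketch of parsing that header, using only the field layout defined in the MPEG-2 Systems standard:

```python
def parse_ts_packet(packet: bytes) -> dict:
    """Parse the 4-byte header of one 188-byte MPEG TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:
        raise ValueError("not a valid MPEG TS packet")
    return {
        # Set by the network layer when the packet arrived corrupted.
        "transport_error": bool(packet[1] & 0x80),
        # True when this packet starts a new PES packet or table section.
        "payload_unit_start": bool(packet[1] & 0x40),
        # 13-bit packet identifier used to de-multiplex the streams.
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],
        # 4-bit counter used to detect lost packets on a PID.
        "continuity_counter": packet[3] & 0x0F,
    }
```

In a setup like the one described, the receiving player would filter incoming packets by PID to separate the video, audio, and depth-map elementary streams before decoding.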