
Holographic Telepresence Breakthrough Announced at U of A - With Telepresence Options Publisher Howard Lichtman's Thoughts and Analysis

November 4, 2010 | Howard Lichtman
University of Arizona scientist Nasser Peyghambarian stands in front of his rapidly refreshing holographic display

Hundreds of publications including the Financial Times of London, the Drudge Report, Wired, and others are reporting on a recent announcement by scientists at the University of Arizona's College of Optical Sciences, led by Nasser Peyghambarian, that they have developed a 17-inch, rapidly refreshing holographic display with the future potential to be a telepresence display technology.  Here is my overview of what has been announced and what it means for the future of telepresence.


What Has the University of Arizona Team Announced?

Here is a summary from the U of AZ News:

"The prototype device uses a 10-inch screen, but Peyghambarian's group is already successfully testing a much larger version with a 17-inch screen. The image is recorded using an array of regular cameras, each of which views the object from a different perspective. The more cameras that are used, the more refined the final holographic presentation will appear.

That information is then encoded onto a fast-pulsed laser beam, which interferes with another beam that serves as a reference. The resulting interference pattern is written into the photorefractive polymer, creating and storing the image. Each laser pulse records an individual "hogel" in the polymer. A hogel (short for holographic pixel) is the three-dimensional version of a pixel, the basic units that make up the picture.

The hologram fades away by natural dark decay after a couple of minutes or seconds depending on experimental parameters. Or it can be erased by recording a new 3D image, creating a new diffraction structure and deleting the old pattern."



The breakthrough here is that the image refreshes every two seconds, a 100-fold improvement over the team's last major announcement.  That is still far from the 30 to 60 frames per second that conventional telepresence and videoconferencing systems achieve today, albeit in 2D. 30 frames per second is generally considered to be the threshold for fluid, natural motion.
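To make those numbers concrete, here is a minimal arithmetic sketch of how a write-one-hogel-per-pulse display refreshes and how far a two-second refresh sits from video rates. The hogel count and pulse rate below are hypothetical placeholders, not figures from the U of A announcement, and the ~200-second figure for the earlier prototype is simply what the 100-fold claim implies.

```python
# Back-of-envelope on the refresh rates discussed above.
# Hogel count and pulse rate are hypothetical placeholders, NOT figures
# from the University of Arizona announcement.

def refresh_time_s(num_hogels: int, pulses_per_second: float) -> float:
    """Seconds to write one complete frame if each laser pulse records one hogel."""
    return num_hogels / pulses_per_second

new_refresh_s = refresh_time_s(num_hogels=120, pulses_per_second=60.0)  # -> 2.0 s
old_refresh_s = new_refresh_s * 100   # implied by the "100-fold improvement" claim
video_fps = 30.0                      # rough threshold for fluid, natural motion

print(f"New display: one image every {new_refresh_s:.0f} s ({1 / new_refresh_s} fps)")
print(f"Implied earlier prototype: one image every ~{old_refresh_s:.0f} s")
print(f"Still {video_fps * new_refresh_s:.0f}x short of {video_fps:.0f} frames per second")
```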

Wired had a great quote from optical scientist Michael Bove of the MIT Media Lab, who was not involved in the new research but is collaborating with Peyghambarian on another project. "This is mostly a materials advance, the material is faster and more sensitive than what had previously been reported."

Given the small size of the screen and the two-second lag time, "some people in the field object to the term 'telepresence,'" Bove said.


Telepresence Options Publisher Howard Lichtman's
Thoughts and Analysis 


The University of Arizona announcement is, without a doubt, an important step in the development of holographic projection, and my congratulations go out to the folks at the University of Arizona College of Optical Sciences and Nitto Denko, which developed the photorefractive polymer.

The "Holy Grail" of telepresence conferencing is the ability to project photo-realistic 3D representations of remote participants into a physical space where they can be interacted with in real-time.  This requires capturing the image of the remote participants in three dimensions, compressing that information, sending it across a network, decompressing, and then projecting that representation as a three-dimensional photo-realistic image that appears solid. 

The procedure is even more complicated because, to have a natural, humanistic interaction with the projected remote participant, the local camera capturing the image must be aligned to provide eye contact (or the approximation of eye contact) and satisfy the innate expectations human beings have for interpersonal communication.  Existing telepresence systems and environments do this by precisely positioning the participants with respect to the screen and camera to achieve the right eye line and aspect ratio with remote participants.
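One way to see why that positioning matters: the perceived loss of eye contact is driven by the angle between the camera and the on-screen eyes of the remote participant, as in the small sketch below. The 0.10 m camera offset and 2.4 m seating distance are illustrative assumptions, and the few-degrees tolerance is a commonly cited rule of thumb rather than a figure from this announcement.

```python
import math

def gaze_offset_degrees(camera_offset_m: float, viewing_distance_m: float) -> float:
    """Angle between the camera and the on-screen eyes, as seen from the viewer's seat."""
    return math.degrees(math.atan2(camera_offset_m, viewing_distance_m))

# Example: camera hidden 0.10 m above the remote participant's on-screen eye line,
# viewer seated 2.4 m from the display (plausible for an immersive telepresence room).
angle = gaze_offset_degrees(0.10, 2.4)
print(f"gaze offset ~{angle:.1f} degrees")  # ~2.4 degrees -- small enough to read as eye contact
```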

The methodology described by the U of A requires an array of cameras capturing the participant from multiple angles and transmitting the image to a remote site, where it is rendered on a static polymer display.  This appears to presuppose that participants will be in a fixed space looking at a fixed polymer display, with no major movement of participants.

Hey Media... Guess What: The telepresence industry is delivering essentially the exact same format and aspect ratio today with photo-realistic, high-definition 2D images.  In some augmented-reality solutions, where the images are projected onto a pane of silvered glass or clear polymer, the images appear to "float" in mid-air, and probably 90% of uneducated observers believe them to be 3D. Since the U of A's version of holographic telepresence will still require fixed capture and a fixed display, negating the ability to move around and view the projected image from other angles, you are looking at dramatically more cost and complexity for what will, potentially, be a tiny incremental improvement in realism. And that is IF the ultimate laser-and-polymer combination can produce the same quality of high-resolution color images that DLP/LED/OLED projection delivers today in 2D, AND at an acceptable cost in equipment and bandwidth for mass adoption.
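To give a feel for the bandwidth side of that cost argument, here is a rough back-of-envelope comparison. The per-stream bitrate and camera count are illustrative assumptions, not measured figures from any vendor or from the U of A work.

```python
# Rough back-of-envelope: one 2D telepresence stream vs. a multi-camera capture array.
# Bitrates and camera count are illustrative assumptions, not measured figures.

hd_stream_mbps = 4.0   # assumed bitrate of one 1080p telepresence stream
camera_count = 16      # assumed size of the capture array

single_2d = hd_stream_mbps
multi_view_raw = camera_count * hd_stream_mbps  # before any holography-specific encoding

print(f"2D telepresence:         ~{single_2d:.0f} Mbps")
print(f"{camera_count}-camera capture array: ~{multi_view_raw:.0f} Mbps before multi-view compression")
print(f"-> roughly {multi_view_raw / single_2d:.0f}x the transport bandwidth")
```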

The unreported story in the media is that the telepresence industry is not only delivering this level of quality today, but every major input for delivering these solutions is improving in quality (greater resolutions, larger displays, brighter LED and DLP projectors, higher-quality QoS bandwidth) at the same time that the major cost components are dropping like a rock.  In addition, the utility of telepresence (the number of organizations and individuals you can reach, the available content, etc.) is headed through the roof.

My complete SWAG on the reality of holographic telepresence is that we are 10-15+ years away from it being able to deliver the same end-user acceptance/preference that 2D systems are achieving today.

However, we are 3-7+ years (another SWAG) away from making low-cost, high-quality 2D telepresence common in business and academia, with the ability to connect seamlessly to other organizations globally, and that is going to revolutionize communications, learning, and the world.  In fact, the process has already begun!

  "The Future is Here.  It Just isn't Evenly Distributed Yet"
- William Gibson

Here are some examples of existing telepresence systems with seamless displays and high-definition photo-realistic images where the projected images appear 3D to lay observers.

The Digital Video Enterprise Immersion Room powered by a LifeSize Room 220 Codec


Cisco CEO John Chambers talks with SVP Marthin DeBeer, who is appearing life-size in India using Musion's Eyeliner On-Stage Telepresence Solution and a Cisco codec.

The TelePresence Tech 3D TelePresence Room (Since Renamed the TPT 4000)




The Polycom RPX 400 Series has a 16-foot by 4-foot video wall and hidden cameras


About the Author


Howard Lichtman is the President of the Human Productivity Lab, an independent consultancy focused on telepresence and effective visual collaboration for organizations looking to improve productivity and reduce costs.  The Lab provides corporate clients with acquisition consulting, RFI/RFP creation, and ROI/TCO financial modeling on telepresence systems, telepresence managed services, and inter-networking telepresence. The Lab also provides investors with prescient insight into the rapidly growing telepresence industry.  Mr. Lichtman is also the publisher of Telepresence Options, the #1 website on the internet covering telepresence technologies and the Editor of the Telepresence Options Telegraph.






