Apple Invention Automatically Adjusts A/V Settings Based On User Positioning And Environment
The U.S. Patent and Trademark Office on Thursday published an Apple patent application for a system that dynamically changes audio and video settings based on where a user is located in relation to the source device, thus allowing for the best viewing or listening experience.
Apple's patent application for "Devices with enhanced audio" covers not only speaker settings but video adjustments as well. By using various sensors, such as cameras and microphones, the system is able to provide an optimal experience with little to no input from the user.
The method starts with data collected from a large variety of sensors, including imaging sensors, proximity sensors, microphones and infrared sensors, among others. By taking input from a given sensor array and processing the data, the system can determine the positioning of a user in relation to a computer screen or source device. Also taken into account is the user's environment, for example a large room with wooden floors, or a small room with drawn curtains.
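As a rough illustration of the pipeline described above, the following sketch maps a handful of sensor readings to audio and video settings. All field names, thresholds, and mappings here are hypothetical, chosen only to illustrate the idea; the patent itself does not specify them:

```python
from dataclasses import dataclass


@dataclass
class SensorReading:
    distance_m: float      # user distance, e.g. from a proximity/imaging sensor
    reverb_rt60_s: float   # room reverberation time estimated via microphone
    ambient_lux: float     # ambient light level from a light sensor


def adjust_av_settings(reading: SensorReading) -> dict:
    """Map sensor data to A/V settings (illustrative thresholds only)."""
    # Raise volume as the listener moves away from the source device.
    volume = min(1.0, 0.25 * reading.distance_m)

    # A reverberant room (long RT60, e.g. bare wooden floors) can benefit
    # from cutting low frequencies; a damped room (drawn curtains) does not.
    bass_cut_db = 6.0 if reading.reverb_rt60_s > 0.6 else 0.0

    # Dim the screen in a dark room to avoid glare.
    brightness = 0.3 if reading.ambient_lux < 50 else 0.8

    return {"volume": round(volume, 2),
            "bass_cut_db": bass_cut_db,
            "brightness": brightness}
```

A user two meters away in a reverberant, darkened room would get `adjust_av_settings(SensorReading(2.0, 0.8, 20))` = moderate volume, a bass cut, and a dimmed display.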
A multitude of inputs is covered in the patent, including cameras that monitor eye movements (gaze tracking) or use facial recognition to determine where a user is looking. Microphones can gauge the level of reverberation in a room, and light sensors can measure ambient brightness.
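One use of gaze tracking hinted at here is deciding which display the user is attending to. A minimal sketch, assuming gaze direction has already been estimated as a horizontal bearing in degrees (the display names and layout below are invented for illustration):

```python
def gaze_target(gaze_angle_deg: float, displays: dict) -> str:
    """Return the name of the display closest to the gaze bearing.

    `displays` maps a display name to its bearing in degrees relative
    to the camera; both the names and angles are illustrative.
    """
    return min(displays, key=lambda name: abs(displays[name] - gaze_angle_deg))


# A hypothetical three-screen setup, bearings relative to the camera.
screens = {"left": -30.0, "center": 0.0, "right": 30.0}
```

Calling `gaze_target(25.0, screens)` would select `"right"`, so the system could route video or spatialize audio toward that screen.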