Monday, June 25, 2012

MIT projection system extends video to peripheral vision, samples footage in real-time

Researchers at the MIT Media Lab have developed an ambient lighting system for video that would make Philips' Ambilight tech jealous. Dubbed Infinity-by-Nine, the rig analyzes frames of footage in real time -- with consumer-grade hardware, no less -- and projects rough representations of the video's edges onto a room's walls or ceiling. Synchronized with camera motion, the effect aims to extend the picture into a viewer's peripheral vision. MIT guinea pigs have reported a greater feeling of involvement with video content when Infinity-by-Nine was in action, and some even claimed to feel the heat from on-screen explosions. A five-screen multimedia powerhouse it isn't, but the team suggests the technology could be used for gaming, security systems, user interface design and other applications.
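
MIT hasn't published the Infinity-by-Nine source, but the basic idea (sample the outer strips of each frame and blow them up into coarse, blurry side panels for the projectors) can be sketched in a few lines of Python. The snippet below is only an illustration under assumptions, not the Media Lab's code: it relies on OpenCV and a stand-in clip named "input.mp4", and the edge fraction and panel resolution are made-up parameters.

```python
import cv2

EDGE_FRACTION = 0.15     # assumption: sample the outer 15% of the frame
PANEL_SIZE = (640, 360)  # assumption: output resolution for each side panel

def peripheral_panels(frame):
    """Build rough left/right extensions of a single video frame."""
    w = frame.shape[1]
    strip_w = max(1, int(w * EDGE_FRACTION))
    panels = {}
    for side, strip in (("left", frame[:, :strip_w]),
                        ("right", frame[:, w - strip_w:])):
        # Downsample aggressively, then blur: peripheral vision only needs
        # coarse color and motion cues, not detail.
        coarse = cv2.resize(strip, (16, 9), interpolation=cv2.INTER_AREA)
        panel = cv2.resize(coarse, PANEL_SIZE, interpolation=cv2.INTER_LINEAR)
        panels[side] = cv2.GaussianBlur(panel, (51, 51), 0)
    return panels

cap = cv2.VideoCapture("input.mp4")  # hypothetical stand-in clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    panels = peripheral_panels(frame)
    # A real rig would route these panels to side projectors; here they
    # are simply shown in windows next to the source frame.
    cv2.imshow("video", frame)
    cv2.imshow("left panel", panels["left"])
    cv2.imshow("right panel", panels["right"])
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The actual system also tracks camera motion so the projected extensions pan with the footage; that step (typically some form of optical-flow estimation) is omitted from this sketch.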

Originally appeared on Engadget on Mon, 25 Jun 2012. Via Gizmodo | Source: MIT

