Image via zedomax.com
A team of crack computer wizards has pulled off quite an accomplishment. Marshall Kirkpatrick of ReadWriteWeb reports that this group of "scientists" has crafted software enabling anyone to perform "on-the-fly analysis" of live streaming video on the iPhone.
Used alongside existing methods of displaying data on top of the camera's view, this new functionality signals a fundamental change in the kinds of Augmented Reality (AR) apps that iPhone developers can create. Existing AR apps, like Yelp, Layar, and Wikitude, overlay data on the camera's view but don't actually analyze what the camera sees. This new development changes that.
This A-Team of iPhone hackers is composed of the geniuses at the Visual Media Lab at Ben Gurion University, who worked with HIT Lab NZ to write the code. Together they revealed the hack on the AR-specialist blog Games Alfresco.
In a demonstration video, the team showed how software built on top of the now-exposed API could look at a 2D image drawn on paper, render that image in 3D, and then subject the 3D rendering to a physics simulation.
The possibilities here are huge. While location-based AR has been clumsy at best so far, due to the imprecise nature of GPS and mapping data, this kind of object-centric AR, tied to what the camera actually sees, opens up a whole new world of potential applications. Let's see what you've got, AR devs of the world!