Ever wanted to use your iPad but don’t have enough stupid hands to multitask? No? Alright then. Do you even have an iPad? Well, that’s too bad. I’m still gonna tell you about this new tech though, so deal with it.
According to the grocery-store-copy-machine flyer handed to me at this show floor demo, literally sitting on the floor, “building upon the spatial-tracking success of the best-selling applications Wall Painter and Home Decorator,” this middleware uses the front-facing camera on iOS devices to enable motion tracking for use in games and apps. But seriously. Wall Painter. Home Decorator. Best-selling.
Demonstrated in a basketball game called VisionBall, the tech uses the camera to detect the depth of your hand and individual fingers, and gestures like shooting a basketball will, obviously, shoot a basketball. It’s very similar to early EyeToy games and, more recently, Kinect games… like Sports Champions.
The tech has been in development for six months now by a single man, Joel Teply. According to his representatives, Teply is in talks to collaborate with CNN and Pixar. Yeah.
The demo itself was made in the week before E3 and had some hitches, especially with the convention center lighting. There were multiple options to fully calibrate the tech, and by multiple options, I mean sliders. Like Jerry O’Connell’s run on FOX, these won’t make it into the final product. Janky is where the tech stands right now, but still, motion control is the video game equivalent of a VD.
This tech’s real debut was at WWDC, where it showed off its VisionBalls to the heads of Apple and got some great buzz from developers, according to Nathan Evans, project manager at Digital Rising. The E3 demo was just to show that something is coming and in the works. I don’t blame them for having trouble with the tech given the rush it endured, but proving that something is in the works is beneficial. I still stand by my views on motion control in games, but for other uses, like navigating through the OS, I can see the potential. I’m lazy, so waving my hands around and actually having those actions do something seems beneficial, especially when I’m lying in bed just inches away from my computer.
Upon further discussion with Evans, the main difference between this tech and Kinect is that it isn’t tacked onto an otherwise controller-based navigation system. iOS is all touch, so this tech’s finger recognition might prove better than Kinect’s often janky navigation.
Kinect comparisons aside, this tech is still a work in progress, but the potential to make every iPad user a bit lazier still stands. Seeing the VisionBall demo didn’t much pique my interest, due to its early bugs and unsurprising game concept, but if this tech can get us closer to that “Minority Report” style of navigation, as the Move and Kinect initially promised, I’m all for it.
Until then, shoot some VisionBalls, paint some walls, and decorate some homes. Just don’t touch me.