LAS VEGAS—In 2010, Microsoft's Kinect for Xbox showed that a computer really could see and understand what you are doing. But that required a bulky, $100 add-on device that worked only with one game console (and now with PCs, thanks to a software developer kit Microsoft released in 2011).
At CES this year, an Israeli firm called Extreme Reality demonstrated the same body-tracking technology, minus the Kinect or Xbox. In fact, the software was running on a ho-hum Lenovo laptop with a built-in webcam.
Instead of the Kinect's sophisticated setup with a camera, infrared beam and infrared receiver (all developed by a company called PrimeSense), Extreme Reality uses clever algorithms that can extract a lot of data from simple, not-even-HD video.
An employee demonstrated this at the show by jumping around in front of that Lenovo laptop to control games that were displayed on an HDTV. In one, she took the on-screen form of a panda dancing around the screen. In a racing game called "Beat Booster," her character rode perched atop a rocket engine, controlled entirely by her gyrations. Game maker Current Circus developed the game together with Extreme Reality.
The game is currently available for Windows 8 PCs, but Extreme Reality plans to expand much further. Maya Gershon, the company's VP for sales and marketing, told TechNewsDaily that the company has already developed an Android version of its underlying software. And an iOS version is on the way, she said, set to debut in a game the company is creating with "a big game developer."
The motion-enabled games could also one day run as apps on smart TVs — many new models of which debuted at CES.
Other companies have tried using a simple camera to track movements and control menus — often with disastrous results, as in the case of Samsung's latest smart TVs. Gershon declined to discuss specific companies, but she gave some hints as to why firms like Samsung have failed. Instead of looking at just the hand that is supposed to move a cursor on a TV menu, for example, the Extreme Reality software looks at the entire arm — in fact, the entire body — to get a better sense of what exactly the motion is. The algorithm also applies some common-sense logic, she said. For example, it assumes that a left hand probably won't end up on the right side of the user's body. [See also: Waving or Drowning? Samsung Gesture-Reading TVs Can't Tell]
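Extreme Reality hasn't published its algorithm, but the kind of common-sense check Gershon describes can be sketched as a simple plausibility filter over candidate poses. Everything below is hypothetical: the joint names, the coordinate convention (x measured in the user's own frame, growing toward the user's right) and the pixel tolerance are all invented for illustration.

```python
# Hypothetical sketch of a "common-sense" pose filter: reject skeleton
# guesses where a hand lands implausibly far on the wrong side of the
# body's midline. Joint names and the tolerance are assumptions.

def is_plausible(pose):
    """pose maps joint names to (x, y) coordinates in the user's frame,
    where x grows toward the user's right."""
    midline_x = pose["torso"][0]
    tolerance = 40  # slack, in pixels, for hands crossing the body

    if pose["left_hand"][0] > midline_x + tolerance:
        return False  # left hand far to the body's right: unlikely
    if pose["right_hand"][0] < midline_x - tolerance:
        return False  # right hand far to the body's left: unlikely
    return True

# A tracker would discard implausible candidate poses and keep the rest.
print(is_plausible({"torso": (320, 240),
                    "left_hand": (250, 200),
                    "right_hand": (400, 210)}))  # True
```

A real system would apply many such constraints at once (joint-angle limits, limb lengths, temporal smoothness), but the principle is the same: anatomy rules out most wrong guesses before the software has to choose among them.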
Gershon also showed how the technology could be used to control menus on a TV or PC. In a photo album program on her laptop, she simply waved her hand right or left (without touching the screen) to move on to the next or previous picture. It's like the swiping motion Apple uses on its iPad photo app, minus the finger smudges on the screen.