Touchscreens are everywhere these days, and thanks to Seattle-based startup Ubi Interactive, just about any surface can become one. All you need is a Windows 8 PC, a projector, a Kinect sensor, and Ubi's software. Now out of beta, the software interprets movements picked up by the Kinect sensor and translates them into Windows 8 touch gestures. Here's the requisite promo video. My apologies for the cheesy music.
I see a bit of latency between hand movements and their effects, though perhaps not enough to spoil the experience for simple applications. Ubi claims the depth sensor is precise enough to know whether the user is tapping on the projected screen or hovering just above it. The firm says the system may not work with some plasma screens and glass surfaces, though.
There are also limitations tied to the Kinect sensor's distance from the projected screen. With the sensor 31.5-55" away, "finger mode" provides enough sensitivity to track individual digits. Position the depth sensor 55-84" from the screen, and you get the coarser "hand mode." Finger mode only works with screen sizes up to 45", while hand mode scales up to 100".
At $149 for the base package, the software isn't cheap. The Basic option is limited to a single touch point and finger mode, with support relegated to forums. You have to shell out $380 to power a 100" screen and $799 to do so with two-point input. The $1499 option boasts same-day phone support and up to 20 simultaneous inputs on screens up to 100". Regardless of the package, you're only eligible for free software updates for one year.
Although the pricing is probably too steep to lure curious enthusiasts, the technology is intriguing, especially given the higher resolution offered by the Xbox One's Kinect system. (That's coming to the PC next year, by the way.) Motion tracking may not replace traditional PC inputs, but there are lots of interesting ways it could augment the overall experience. Thanks to CNet for the tip.