Everyone who watched 2002’s Minority Report will certainly remember how Tom Cruise manipulated data on screen with geekgasm-inducing gestures. While we haven’t reached that level of smooth, gesture-driven interfaces yet, real life is slowly but steadily catching up to Hollywood levels of epicness.
Microsoft released a beta version of the Kinect SDK for Windows just ten days ago, and today we’re seeing some of the first results in the form of a Minority Report-like UI for Windows from developer Kevin Connolly.
Dubbed KinectNUI (Natural User Interface), the work-in-progress project attempts to bring some of Minority Report’s gestures to the world of Windows. If you haven’t watched Minority Report or have forgotten the UI, here’s a clip from the movie which ought to help out.
Gestures don’t work particularly well yet, and the lag between performing a gesture and seeing the result on screen is far too long. But this is completely understandable since the project has been under development for only a few days. It’s more of a proof-of-concept than a fully functioning product.
Besides this, Windows 7 itself really isn’t optimized for touch or gesture-based input.
Around ten or so gestures are showcased; some work, while others don’t work at all. There are gestures for activating Aero Flip, maximizing/minimizing a window, dragging a window across the screen, and zooming in and out.
KinectNUI also supports head tracking, so if you’re zoomed in and you move around, the view will pan to different parts of the screen.
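For the curious, the core idea behind this kind of gesture recognition is simply comparing skeletal joint positions across frames. Here’s a minimal sketch in Python; note that KinectNUI itself is built on Microsoft’s Kinect SDK (a C#/.NET stack), and the function names, thresholds, and coordinate conventions below are hypothetical, chosen only to illustrate the technique:

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) points in normalized [0, 1] coordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_zoom(prev_hands, curr_hands, threshold=0.05):
    """Classify a two-handed zoom gesture from consecutive frames.

    prev_hands / curr_hands are ((lx, ly), (rx, ry)) tuples for the left and
    right hand. Returns "zoom_in", "zoom_out", or None if the change in hand
    spread is below the threshold (to ignore jitter).
    """
    prev_spread = distance(*prev_hands)
    curr_spread = distance(*curr_hands)
    if curr_spread - prev_spread > threshold:
        return "zoom_in"      # hands moving apart
    if prev_spread - curr_spread > threshold:
        return "zoom_out"     # hands moving together
    return None

def head_to_pan(head_x, head_y, screen_w=1920, screen_h=1080):
    """Map a head position in [0, 1] x [0, 1] to a pan target in pixels,
    mirrored horizontally so leaning left pans the zoomed view left."""
    return (int((1.0 - head_x) * screen_w), int(head_y * screen_h))

# Hands spreading apart between frames reads as a zoom-in:
print(detect_zoom(((0.4, 0.5), (0.6, 0.5)), ((0.3, 0.5), (0.7, 0.5))))
```

A real implementation would smooth the joint stream and require the gesture to persist over several frames before firing, which is exactly the kind of tuning an alpha-stage project like this still needs.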
Watch the demo below:
One interesting gesture planned for KinectNUI is a pie menu (much like the one in The Sims) that will let users pull off more tasks.
Kevin Connolly can’t do this alone, of course. He has open-sourced the project so you can help him realize every geek’s dream of seeing a Minority Report-like UI work well before they die.
Anyone and everyone can go ahead, download the code, and add their ideas to KinectNUI. Since it is in the alpha stage, there is a lot of tweaking to be done to ensure that gestures work properly.