Google is worried that we’re pushing at the limits of what human beings are capable of, or willing to endure, in terms of fiddly interfaces. With devices set to shrink yet again, as wearables become the supposed next tech fad, Google has been looking at alternative input devices to the touchscreen. What it has come up with is a tiny radar chip that can sense gestures made by your hand.
It sounds a bit like Microsoft’s Kinect, but it works on a much smaller scale, is designed to detect much finer motions, and, just as importantly, is far more compact. It tracks your movements at 60fps, giving quite fine feedback, though quick, subtle gestures can still trip it up. It uses a broad radar beam to track your hand.
A demo showed the device accurately tracking a hand making gestures a few inches from the device. By rubbing a finger and thumb together, the demonstrator controlled a virtual dial, and scrolled through a virtual menu. You could also use a finger on a virtual touchscreen, or flick a ball in a game of virtual Subbuteo (we’ve been waiting for it to make a digital comeback).
It all looks very promising, and a development kit and APIs will be available later this year, so that developers can start building hardware and apps to make full use of the tech. The idea of making gestures above your smartwatch, in a larger ‘virtual’ space, should help ease the current difficulty of interacting with tiny screens.
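Since the development kit isn’t out yet, we can only guess what its APIs will look like. As a purely illustrative sketch, a gesture API along these lines would let an app map radar-detected gestures to actions; every name here (`GestureSensor`, `on`, `emit`, the `"dial"` gesture) is invented for the example and is not from any real Soli SDK.

```python
# Hypothetical sketch only: the real Soli SDK is not yet public, so all
# names below (GestureSensor, on, emit, "dial") are invented for illustration.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class GestureSensor:
    """Toy event dispatcher standing in for a radar gesture sensor."""
    handlers: Dict[str, List[Callable[[float], None]]] = field(default_factory=dict)

    def on(self, gesture: str, handler: Callable[[float], None]) -> None:
        # Register a callback for a named gesture.
        self.handlers.setdefault(gesture, []).append(handler)

    def emit(self, gesture: str, value: float) -> None:
        # In real hardware this would fire from the radar pipeline;
        # here we call it by hand to simulate a detected gesture.
        for handler in self.handlers.get(gesture, []):
            handler(value)

# Usage: map a finger-and-thumb "dial" rub to a volume change.
volume_changes = []
sensor = GestureSensor()
sensor.on("dial", lambda delta: volume_changes.append(delta))
sensor.emit("dial", 0.25)   # simulate a small clockwise rub
print(volume_changes)       # → [0.25]
```

The point of the sketch is the event-driven shape: apps wouldn’t process raw radar data themselves, they’d subscribe to recognised gestures like the virtual dial or scroll shown in the demo.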
Project Soli isn’t the only attempt Google is making to crack this problem, though. It has also teamed up with Levi’s on smart clothes with integrated, woven, touch-sensitive areas. Read all about that in Google partners with Levi’s to make smart clothes.