"How are you going to interact with an invisible computer?"
Ivan Poupyrev, who posed that question (and many more), works at Google's ATAP research lab and is the technical project lead for Project Soli, which aims to prove that tiny radar chips can be embedded in electronics so that we can use minute hand gestures to control the digital world around us.
Soli is still in the prototype stage, but the video shows how the radar picks up your hand and tracks how close it gets to the watch. As your hand approaches, the information shown on the display changes. Once it is close enough, the radar reads your hand gestures, letting you control the gadget. Put simply: you snap your fingers in the air just a couple of inches away, and the digital watch face starts spinning.
Nick Gillian, lead machine learning engineer for Soli, and his team settled on basic gestures. There are essentially two zones, near and far. From far away, you don't do much (or you could wave your arm around, Kinect-style). But when you get close, Soli can detect finer and finer movements. So the first gesture is simple: proximity. As you move your hand closer to the watch, it lights up, showing you information and letting you know your hand is in the zone of real interaction.
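The two-zone proximity behavior described above can be sketched in code. This is a hypothetical illustration, not Soli's actual software: the threshold distances, the `DisplayState` type, and the brightness ramp are all assumptions made up for the example.

```python
# Hypothetical sketch of Soli-style two-zone proximity interaction.
# Thresholds and names are illustrative assumptions, not Soli's API.
from dataclasses import dataclass

FAR_THRESHOLD_MM = 300   # beyond this: only coarse, Kinect-like waves
NEAR_THRESHOLD_MM = 50   # inside this: fine, millimeter-scale gestures

@dataclass
class DisplayState:
    brightness: float     # 0.0 (dim) to 1.0 (fully lit)
    fine_gestures: bool   # whether fine gestures are accepted

def update_display(distance_mm: float) -> DisplayState:
    """Map hand distance to display behavior: the closer the hand,
    the brighter the watch face, until fine interaction unlocks."""
    if distance_mm >= FAR_THRESHOLD_MM:
        return DisplayState(brightness=0.1, fine_gestures=False)
    if distance_mm <= NEAR_THRESHOLD_MM:
        return DisplayState(brightness=1.0, fine_gestures=True)
    # In between, brightness ramps up smoothly as the hand approaches.
    span = FAR_THRESHOLD_MM - NEAR_THRESHOLD_MM
    t = (FAR_THRESHOLD_MM - distance_mm) / span
    return DisplayState(brightness=0.1 + 0.9 * t, fine_gestures=False)
```

Under these assumptions, a hand half a meter away leaves the watch dim and ignores fine movement, while a hand within a couple of inches lights the face fully and enables the finger-snap-style gestures.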
The first Soli prototypes are an LG Watch Urbane and a JBL speaker, and neither is anywhere close to being a consumer product. Speakers vibrate to make sound, so there are hard problems to solve when you're trying to add a chip that detects millimeter-scale movements. The watch still has power and interaction issues to suss out.
But ATAP isn't just building these to prove they're possible in theory. They're doing it in partnership with LG, Qualcomm, JBL, and others to prove to these companies that they can put them in real, shipping consumer products.