Google’s Project Soli to bring hand controls to wearables

By nfcworld.com • Published 2 June 2015, 13:45 • Last updated 2 June 2015, 13:34

PROJECT SOLI: Hand-gesture technology aims to enable rich interactions with smart devices

Google is working to create next-generation hand-gesture technology that would enable rich interactions with smart devices regardless of screen size and “drive applications across wearable, home automation, automotive, industrial and medical markets”.

A video produced by the company’s Advanced Technology and Projects (ATAP) group demonstrates how Project Soli works.

“The hand is the ultimate input device; it’s extremely precise, it’s extremely fast and it’s very natural for us to use it,” says ATAP’s Ivan Poupyrev, founder of Project Soli.

“We use radio frequency spectrum, radars, to track human hands. We’re using them to track micro motions, twitches of the human hand, and then use that to interact with wearables and Internet of Things and other computing devices.”
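Poupyrev’s description amounts to a classic radar signal-processing pipeline: illuminate the hand with RF, watch for the Doppler shifts produced by tiny motions, and turn bursts of that energy into input events. Purely as a hedged illustration — this is not Google’s code, and the sampling rate, Doppler frequency and threshold below are invented for the sketch — here is how detecting a single finger “twitch” in a simulated slow-time radar return might look in Python:

```python
import numpy as np

np.random.seed(0)  # deterministic demo

FS = 2000                        # pulse repetition frequency in Hz (assumed)
t = np.arange(0, 1.0, 1 / FS)    # one second of slow-time samples

# Synthetic radar return: static clutter plus noise, with a 100 ms burst of
# Doppler-shifted energy at t = 0.5 s standing in for a finger micro-twitch.
clutter = 0.5 * np.ones_like(t)
noise = 0.05 * np.random.randn(t.size)
doppler_hz = 150                 # assumed micro-motion Doppler shift
burst = (np.abs(t - 0.5) < 0.05).astype(float)
echo = clutter + noise + 0.3 * burst * np.sin(2 * np.pi * doppler_hz * t)

def micro_motion_energy(x, fs, win=64, hop=32, f_lo=50.0):
    """Short-time spectral energy above f_lo Hz: a crude micro-motion proxy."""
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, 1 / fs)
    band = freqs >= f_lo         # ignore DC / stationary clutter
    times, energies = [], []
    for start in range(0, x.size - win, hop):
        frame = x[start:start + win] * window
        power = np.abs(np.fft.rfft(frame)) ** 2
        times.append((start + win / 2) / fs)
        energies.append(power[band].sum())
    return np.array(times), np.array(energies)

times, energy = micro_motion_energy(echo, FS)
threshold = energy.mean() + 2 * energy.std()   # simple adaptive threshold
hits = times[energy > threshold]
if hits.size:
    print(f"micro-gesture detected around t = {hits.mean():.2f} s")
else:
    print("no micro-gesture detected")
```

In a real system the detection stage would feed a gesture classifier rather than a print statement, but band-limited Doppler energy followed by a decision rule is one common way to realise the idea the quote describes.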