Monday, 8 June 2015

Google: Your Hands May Be the Only Interface You’ll Ever Need


Project Soli – Hands Only User Interface

Wearable devices keep getting smaller, and our hands and fingers struggle to keep up with them. So engineers at Google's Advanced Technology and Projects group (ATAP) have been working on ways to control devices with simple hand gestures. Google ATAP believes the hand is the best tool for interacting with devices, but not everything needs to be a device: with Project Soli, the latest "wearable" is not a watch at all, it is you.

Project Soli aims to make the hands and fingers the only user interface you need. It is in fact a radar small enough to fit inside a wearable such as a smartwatch. The tiny radar picks up movements in real time and uses them to modulate its signal. Even a resting hand moves slightly, which registers as a baseline response on the radar; moving the hand toward, away from, or side to side relative to the radar changes the signal and its amplitude, and so does crossing the fingers or making a fist.
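The baseline-and-deviation idea above can be sketched in a few lines. Soli's actual signal processing has not been published, so everything here, from the function names to the threshold value, is a hypothetical toy illustration of the concept, not Soli's implementation.

```python
# Toy illustration: a resting hand produces a small baseline signal,
# and a deliberate gesture shows up as a large deviation from it.
# All names and numbers here are hypothetical.

def baseline(samples):
    """Average amplitude of the resting-hand signal."""
    return sum(samples) / len(samples)

def detect_gesture(samples, rest_baseline, threshold=0.5):
    """Flag each sample that deviates from the baseline by more than the threshold."""
    return [abs(s - rest_baseline) > threshold for s in samples]

# A resting hand: tiny fluctuations around an amplitude of 1.0.
rest = [1.02, 0.98, 1.01, 0.99]
b = baseline(rest)

# A hand moving toward the radar raises the amplitude sharply.
live = [1.01, 1.00, 2.3, 2.5, 1.02]
print(detect_gesture(live, b))  # [False, False, True, True, False]
```

A real pipeline would work on raw radar returns rather than a single amplitude number, but the principle of comparing motion against a resting baseline is the same.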

ATAP to Provide APIs

To make the signal meaningful to an app or service, ATAP will provide APIs that tap into Project Soli's deep machine learning. It is still early days for the project, but it has the crowd at Google I/O eager: instead of going hands-free, Project Soli would make the hands the UI, which could prove a better option than voice control. Ivan Poupyrev, technical program lead for Google ATAP, says that should Project Soli become a reality someday, users would be able to control smartwatches, fitness trackers and other devices with the same gestures they use on their smartphones, only without the phone.
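Google has not published what those APIs look like, so the sketch below only illustrates how an app might tap into gesture recognition in principle. Every name here (`SoliRecognizer`, `on_gesture`, `feed`) is invented for the sake of the example.

```python
# Hypothetical sketch of an app-facing gesture API. The real Soli APIs
# are unpublished; this only shows the event-handler pattern an app
# might use to react to recognized gestures.

class SoliRecognizer:
    """Maps gesture labels (produced by an ML pipeline) to app callbacks."""

    def __init__(self):
        self._handlers = {}

    def on_gesture(self, name, handler):
        """Register a callback for a named gesture."""
        self._handlers[name] = handler

    def feed(self, label):
        # In a real system the label would come from the radar's
        # machine-learning pipeline, not be passed in directly.
        handler = self._handlers.get(label)
        if handler:
            handler()

recognizer = SoliRecognizer()
recognizer.on_gesture("dial_turn", lambda: print("volume up"))
recognizer.feed("dial_turn")  # prints "volume up"
```

The point is the division of labor: the platform's machine learning turns radar returns into gesture labels, and the app only declares what each label should do.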

He told the audience at Google I/O that our hands are always with us, are ergonomic, and could be the only interface we ever need to control wearable devices. Gesture-based interfaces already exist, of course, but several of them, such as the Kinect or PlayStation Move, use cameras to detect hand movements. They therefore work only with a clear line of sight, and are useless in the dark.

Radar Signals from Numerous Antennae

The Soli chip is said to flare out radar signals from numerous antennae thousands of times each second, creating a field that can measure minute finger movements, detect when the fingers are crossed, or record signals from both hands at once. From this, software developers build a vocabulary that interprets each gesture and translates it into an action, such as deleting a message or taking a call on a smartwatch.
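That gesture vocabulary is, at its simplest, a mapping from recognized gestures to device actions. The gesture and action names below are hypothetical; the source does not specify what Soli's vocabulary contains.

```python
# A minimal sketch of a gesture vocabulary: recognized gestures map to
# smartwatch actions. All gesture and action names are invented examples.

GESTURE_VOCABULARY = {
    "swipe_left": "delete_message",
    "tap": "answer_call",
    "fist": "dismiss_notification",
}

def translate(gesture):
    """Translate a recognized gesture into a device action, ignoring unknowns."""
    return GESTURE_VOCABULARY.get(gesture, "ignore")

print(translate("tap"))   # answer_call
print(translate("wave"))  # ignore
```

Keeping the vocabulary as data rather than code is one plausible design: developers can then extend or remap gestures per app without touching the recognition pipeline.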

The latest radar chip has a range of 5 feet or more, allowing it to be integrated into all kinds of devices, including the walls of a home. In the near future you may be able to turn on the lights by snapping your fingers, or turn up the volume on Sonos by turning an imaginary dial in the air. For now, though, Project Soli is just a research project.


