Google’s Future of Radar-Based Technology for Hand Gestures - Wearables
Google is focused on the future of radar-based technology for hand gestures with wearables, a future where one could interact with wearable technology without physical controls like buttons, with the fingers themselves acting as the controls. Project Soli, an interaction sensor that makes use of radar technology, was announced earlier this year. In May, Google’s Advanced Technology and Projects division (ATAP) posted a video with details about Soli. In it, the technical program lead at ATAP introduces himself as Ivan Poupyrev, adding that capturing the possibilities of the human hand has been one of his passions, and asking how those capabilities could be applied to the virtual world.
He explained that the system uses radar, so there is nothing to break: no lenses and no moving parts. He said in the video that radar has been used for many things, such as tracking cars, large objects, satellites and planes. Here it is being used to track the micro-motions and twitches of the human hand and to turn them into interactions with wearables and other computing devices.
Project Soli – Tiny Chips with Radar-Like Potential
According to M Dee Dubroff of InventorSpot.com, "Project Soli focuses on the fact that the ability to function is not always the job of the device but of the movements of the user, especially the hands and fingers." She notes that the project depends on a tiny chip with radar-like potential that can pick up the smallest of movements. Details of the ATAP presentation earlier this year were provided by Alex Davies in Tom’s Hardware, where Poupyrev showed hand gestures and how they could be mapped to interactions. He described rubbing the thumb and index finger together to simulate turning a dial, changing the time on a simulated watch face, and using distance to control whether the hours or the minutes were adjusted, based on how far the hand was from the sensor.
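To make that interaction concrete, here is a minimal sketch of the dial-turning idea described above. None of the class or threshold names come from Project Soli itself; the gesture event, the 15 cm near/far cut-off and the near-is-minutes convention are all assumptions made purely for illustration.

```python
# Hypothetical sketch of the thumb-and-index "dial" interaction.
# A rub is treated as a signed number of dial clicks, and the hand's
# distance from the sensor (an assumed 15 cm threshold) decides whether
# the clicks adjust hours or minutes on a simulated watch face.

from dataclasses import dataclass

HOURS_DISTANCE_CM = 15.0  # assumed boundary between "far" (hours) and "near" (minutes)


@dataclass
class DialGesture:
    """One thumb/index rub as a (hypothetical) sensor might report it."""
    rotation_steps: int      # signed number of inferred dial clicks
    hand_distance_cm: float  # distance of the hand from the radar sensor


@dataclass
class WatchTime:
    hours: int = 12
    minutes: int = 0

    def apply(self, gesture: DialGesture) -> None:
        # Far hand adjusts hours, near hand adjusts minutes (assumed convention).
        if gesture.hand_distance_cm >= HOURS_DISTANCE_CM:
            self.hours = (self.hours + gesture.rotation_steps) % 24
        else:
            self.minutes = (self.minutes + gesture.rotation_steps) % 60


if __name__ == "__main__":
    watch = WatchTime()
    watch.apply(DialGesture(rotation_steps=3, hand_distance_cm=20.0))   # far hand: +3 hours
    watch.apply(DialGesture(rotation_steps=-10, hand_distance_cm=8.0))  # near hand: -10 minutes
    print(f"{watch.hours:02d}:{watch.minutes:02d}")  # prints 15:50
```

The point of the sketch is simply that the same micro-gesture can drive different controls depending on where the hand is, which is the behaviour Poupyrev demonstrated.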
According to the video, radar has some unique properties, such as very high positional accuracy, which means even the smallest movements can be traced.
Turning Radar Hardware into a Gesture Sensor
Jaime Lien, lead research engineer on Project Soli, said the team’s focus was on taking radar hardware and turning it into a gesture sensor. The latest information is that Google has been notifying developers of a forthcoming Project Soli Alpha DevKit. Liam Spradlin wrote in Android Police that Google had sent out applications for a small group of Project Soli dev kits. He reported that, according to an adviser, Google had begun notifying interested groups of an upcoming "Soli Alpha DevKit", asking those notified to fill out an application for the chance to receive one. Spradlin further added that the email stated that those selected would get a development board and SDK, together with the opportunity to participate in a Soli Alpha developer workshop at some point in the future.
In return, potential developers were asked to participate in a private user group, to accept software updates, and to make something really cool. Poupyrev made special note in the video from earlier this year that he is looking forward to releasing this work to the developer community.
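Since the Soli Alpha DevKit’s SDK has not been published, the following is only a speculative sketch of the kind of application loop a developer with such a board might write. The class name, the gesture labels and the poll() call are all invented for illustration and do not describe any real Soli API.

```python
# Illustrative-only sketch: an app polling a (simulated) radar gesture sensor.
# A real dev kit would classify micro-gestures from radar data; here the
# sensor is faked with random choices so the example runs on its own.

import random
import time


class HypotheticalSoliSensor:
    """Stand-in for a gesture sensor exposed by a dev board (not a real API)."""

    GESTURES = ("thumb_rub", "tap", "swipe", "none")

    def poll(self) -> str:
        # Simulated classification result; a real SDK would derive this from radar.
        return random.choice(self.GESTURES)


def run_demo(duration_s: float = 2.0) -> None:
    sensor = HypotheticalSoliSensor()
    deadline = time.time() + duration_s
    while time.time() < deadline:
        gesture = sensor.poll()
        if gesture != "none":
            print(f"detected gesture: {gesture}")
        time.sleep(0.2)


if __name__ == "__main__":
    run_demo()
```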