Gesture Interaction for AR, VR, and Mobile Devices

According to Google's 2015 announcement, the way we interact with our devices could change drastically: the Soli project aims to replace touchscreens and buttons with simple in-air hand gestures for Augmented Reality, Virtual Reality, and mobile devices. The project has been given the green light by the US Federal Communications Commission, which found that it would serve the public interest. It could make everyday tasks easier, from playing in an online casino to learning, healthcare, and banking, among other major industries.

A unique radar-driven interface

Google believes that the proposed functionality will work by interpreting radar readings through RadarCat. RadarCat works like any other radar system: a base unit emits electromagnetic pulses at a target, some of which rebound and return to the sensor. RadarCat then measures, plots, and collates these readings to build a unique radar fingerprint for AR, VR, and mobile devices. With this technological advancement, you could play your favorite casino game without ever touching your device's screen. So far, Soli's interaction has achieved counting, stacking, ordering, moving, and orienting various items, such as Lego blocks and cards.
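The radar principle described above can be sketched in a few lines: a pulse travels to the target and back at the speed of light, so the round-trip time gives the target's distance, and a series of such echo readings forms a crude "fingerprint" that changes as a hand moves. This is an illustrative sketch only; the function names are hypothetical and not part of any Soli or RadarCat API.

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the target, from the pulse's round-trip travel time."""
    # The pulse covers the distance twice (out and back), hence the /2.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

def radar_fingerprint(round_trip_times):
    """Collate a series of echo readings into a simple 'fingerprint':
    the sequence of measured distances, which shifts as the hand moves."""
    return [round(distance_from_round_trip(t), 4) for t in round_trip_times]

# A pulse returning after ~6.67 nanoseconds puts the target ~1 meter away.
print(distance_from_round_trip(6.67e-9))  # roughly 1.0
```

Real radar interfaces also use the Doppler shift and signal amplitude of the echo, not just timing, but the round-trip calculation above is the core idea.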

Potential sophistication

Although gesture interaction can be achieved in other ways, the radar method stands out as the best. Its benefits include:

  • It does not alter your mobile, AR, or VR devices in any way.
  • It has no privacy implications.
  • It does not require a clear visual line of sight.
  • It works effectively in both bright and dark environments.

All in all, every good thing has a hidden weakness, and radar is no exception. Because radar measurements are extremely sensitive, the analysis can produce incorrect readings when tracked objects bend or change shape.

Possible applications

The Soli project focuses on slimming down the modules needed for this technology so it fits into smaller chips, since it uses radar to track hand movements as small as a millimeter. This opens up a remarkable scope of personal control over your mobile device, TV, speakers, and smartwatch.
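Millimeter-scale tracking means even tiny fingertip movements can be turned into commands. As a purely illustrative sketch (not a real Soli API), a controller might threshold a fingertip displacement reported by the radar against the roughly one-millimeter tracking resolution mentioned above:

```python
def classify_gesture(displacement_mm: float) -> str:
    """Map a horizontal fingertip displacement (in millimeters),
    as reported by the radar, to a simple gesture label."""
    if abs(displacement_mm) < 1.0:  # below the ~1 mm tracking resolution
        return "none"
    return "swipe-right" if displacement_mm > 0 else "swipe-left"

print(classify_gesture(0.4))   # too small to register: "none"
print(classify_gesture(3.2))   # "swipe-right"
print(classify_gesture(-2.5))  # "swipe-left"
```

A real system would classify richer motion features over time rather than a single displacement, but the idea is the same: sub-centimeter readings mapped to device commands.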

Soli is an incredible project and the responsibility of Google's Advanced Technology and Projects group (ATAP), which unfortunately has a reputation for not completing tasks. ATAP's Project Ara, Abacus, and Vault all vanished into thin air before seeing the light of day. However, not all is in vain for ATAP: some of its projects, such as Jacquard and Tango, have been successes. Although Tango was eventually canceled, some of its technology lives on in Android's ARCore, which Google uses to build Augmented Reality experiences. Given these shortcomings and the few successful projects, the question remains whether Soli will succeed.

Final Verdict

If the Soli project succeeds, it will be a game-changer for people with impairments or limited mobility. You will merely tap your thumb and index finger together near your mobile phone or smartwatch, and the corresponding action will be triggered immediately.
