How Google plans to reinvent the user interface

When the late Apple CEO Steve Jobs introduced the iPhone 12 years ago, he also introduced many people to the concept of the multitouch user interface. He emphasized the benefit of using “the pointing device we were all born with”: our fingers.

But there’s one thing even more natural than physically poking something with our fingers: in-the-air hand gestures. All humans use hand gestures to communicate with other people.

Now Google wants you to use them to communicate with all your electronics.

Google is working on something it calls Project Soli, which uses radar to control electronics with in-the-air hand gestures.

Soli is quite amazing, actually

Announced in spring of 2015, Soli enables in-the-air gestures to control smartphones, computers, wearable devices and even cars. The project is in the news this week because the FCC just granted a request by Google to operate Soli radar sensors at higher power levels than previously allowed in the U.S. It also granted permission to use Soli devices aboard airplanes.
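The article doesn’t spell out how radar can read a hand gesture, but the general principle is Doppler sensing: a hand moving toward the sensor raises the frequency of the reflected signal, and a hand moving away lowers it. Here’s a toy sketch of that idea in Python; the function names and the simple swipe classifier are illustrative inventions, not Soli’s actual (unpublished) pipeline, though the 60 GHz carrier matches the band Soli reportedly operates in.

```python
# Toy illustration of Doppler-based gesture sensing -- NOT Soli's real
# pipeline, just the physical principle a radar gesture sensor builds on.

def doppler_shift(tx_freq_hz, radial_velocity_mps, c=3e8):
    """Frequency shift of a radar echo from a target moving at
    radial_velocity_mps (positive = toward the sensor)."""
    return 2 * tx_freq_hz * radial_velocity_mps / c

def classify_motion(shifts_hz):
    """Hypothetical classifier: label a gesture from a run of Doppler shifts."""
    mean_shift = sum(shifts_hz) / len(shifts_hz)
    if mean_shift > 0:
        return "approach"   # hand moving toward the device
    if mean_shift < 0:
        return "retreat"    # hand moving away
    return "hold"

# A 60 GHz carrier and a hand moving toward the sensor at 0.5 m/s
# produces a shift of about 200 Hz.
shift = doppler_shift(60e9, 0.5)
print(round(shift))             # 200
print(classify_motion([shift])) # approach
```

A real system extracts far richer features (range, velocity profiles over time) and feeds them to a trained machine-learning model, but the input it starts from is exactly this kind of frequency shift.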

Soli emerged from Google’s Advanced Technology and Projects group (ATAP), which itself was created inside Motorola Mobility by former DARPA director Regina Dugan. Google acquired ATAP when it bought Motorola in 2012, but kept it when it sold the company to Lenovo two years later.

Research labs are a dime a dozen in Silicon Valley, throughout the tech industry and in universities everywhere. They’re notorious for producing impressive technology that never makes it into real products.

ATAP is different in that all projects are expected to move from conception to shipping product in two years. (It often misses that target, but the point is to move aggressively toward productization.)
