FEB 17, 2018 7:45 AM PST

Google Patents Hand Gesture Controls for Home and Car

WRITTEN BY: Julia Travers

Voice control of the environment has already entered many people's lives through the latest remotes and smart home devices such as Amazon's Alexa. On Feb. 15, 2018, Google filed two patents for what may be the next wave of remote-sensing ease: using simple hand gestures in the air to send commands and manipulate options on a screen or on a variety of machines. The corresponding device might be worn on the wrist, as in the picture below, and would emit a radar field to a receiver on the product the user wanted to control. Radar uses radio waves to detect objects, their surroundings and movement between them.

Image from Google's patent. Credit: Google

Google’s Gesture-Sensing Radar Field Patents

“This radar system can both transmit data and sense gestures, thereby performing with a single system, control of many devices and data transmission with those devices,” the first patent states, mentioning that it could control many types of appliances and devices “from refrigerators to laptops.”

One example Google’s team gives in these remote-sensing patent applications is that a user could move songs between a radar-sensitive phone and a stereo by swishing a hand through the air from one device to the other, then choose tracks and control volume with other gestures. It brings to mind the gesture-based content manipulation seen in the 2002 sci-fi movie “Minority Report,” without the midair visual interface. AR and VR integrations seem a natural fit for this type of technology.

The second patent is intended to “provide a gestural interface in a vehicle,” and its illustration includes a smartphone, which would apparently be involved in the system’s “gesture indication” and “context data” functions.

Google’s Project Soli

These radar-sensing techniques may build on what Google already unveiled with Project Soli in 2015, as Kris Wouk of Digital Trends reported. Project Soli was presented at Google I/O, the company’s annual developer conference, where Technical Program Lead Ivan Poupyrev showed how Google was developing radar to track hand gestures, including movements of less than a millimeter. These can be used to turn virtual dials or to interact with virtual buttons or sliders. The sensor and antenna array for Project Soli are packaged into an 8 mm by 10 mm chip that can be embedded in phones, computers, vehicles, wearables or IoT devices.

“Soli has no moving parts, it fits onto a chip and consumes little energy. It is not affected by light conditions and it works through most materials. Just imagine the possibilities ... ” the company’s site states.

In the future, it seems likely that many people will get to try new and perhaps more intuitive ways of controlling the objects, products and systems around them, using gestures through the air rather than solid devices in their hands.

About the Author
Julia Travers is a writer, artist and teacher. She frequently covers science, tech, conservation and the arts. She enjoys solutions journalism. Find more of her work at jtravers.journoportfolio.com.