Google's Project Soli Imagines the Future of UI for Objects

By: Laura McQuarrie - Jun 2, 2015
Project Soli from Google's ATAP (Advanced Technology and Projects) group is an exploration of how hands and fingers can be used to control everyday devices without a touchscreen. The technology at work is a new miniature radar sensor, which redefines what it means to own a piece of wearable technology.

The chip is so small that it can be embedded into objects in place of a button; imagine pinching your fingers together to make a menu selection or rubbing your fingertips together to scroll. Radar is the same technology used to track cars, planes, satellites and other large objects, but here it has been scaled down to pick up the tiny micro-movements of the hand.
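To make that interaction model concrete, the sketch below shows one way an application might react to recognized micro-gestures such as a pinch or a finger rub. It is a minimal illustration only: the GestureEvent and GestureDispatcher names and the gesture labels are assumptions invented for this example, not part of any actual Soli SDK.

```python
# Hypothetical sketch: routing recognized micro-gestures to UI actions.
# The event names and the dispatcher API below are illustrative assumptions,
# not Google's actual Soli interface.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    name: str         # e.g. "pinch" or "finger_rub"
    magnitude: float  # e.g. normalized rub distance, 0..1


class GestureDispatcher:
    """Maps gesture names to UI callbacks and forwards incoming events."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[GestureEvent], None]] = {}

    def on(self, gesture_name: str, handler: Callable[[GestureEvent], None]) -> None:
        self._handlers[gesture_name] = handler

    def dispatch(self, event: GestureEvent) -> None:
        handler = self._handlers.get(event.name)
        if handler:
            handler(event)


# Wire the two gestures described above to UI actions.
ui = GestureDispatcher()
ui.on("pinch", lambda e: print("menu item selected"))
ui.on("finger_rub", lambda e: print(f"scroll by {e.magnitude:.2f}"))

# Simulate events arriving from the radar-based gesture recognizer.
ui.dispatch(GestureEvent(name="pinch", magnitude=1.0))
ui.dispatch(GestureEvent(name="finger_rub", magnitude=0.35))
```

In a system like this, the radar pipeline, not the application, would decide when a pinch or rub has occurred; the application only needs to respond to the resulting events.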

Google's Project Soli asks a question that will shape the future of UI design: how can the fine movements of the human hand be harnessed by the virtual world?