(Note: This is a fictionalized account of how a journalist might have covered the paper “Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar” by Jaime Lien et al. None of the quotations below are real and must be interpreted as fiction only.)
Google Advanced Technology And Projects Group
Gesture-based interaction with your TV or smartwatch may finally move from the realm of science fiction into reality, thanks to recent research at Google’s Advanced Technology and Projects (ATAP) group.
The team recently unveiled Soli, a compact low-power gesture sensing technology based on millimeter-wave radar. “Soli marks a departure from previously developed technologies that have used computer vision tools to identify gestures. While research in computer vision continues to advance, in its current state the technology isn’t advanced enough to enable real-time and device-free gesture sensing,” says the lead author of the study. “We have significantly bridged the gap between prior work and the futuristic vision of device-free human-computer interaction by using millimeter-wave radar instead of cameras.”
Radar, short for radio detection and ranging, is a technology first developed in the years leading up to and during the Second World War to detect and track aircraft and missiles. One major benefit of radar signals is that they continue to function even in the presence of occlusions or fog. By using millimeter-wave radar technology, the researchers seek to leverage advances in semiconductor devices that allow their system to be implemented as a compact chip embeddable in consumer devices.
However, the main issue with using radar in an application like gesture sensing is the extreme difference in scale between mainstream radar hardware and the requirements for integration with modern consumer devices, like smartphones or wearables. In addition, prior work on radio-based sensing has focused on large scale limb or movement tracking, as opposed to the fine-grained hand and finger gestures such applications demand. Soli is the first radar system that enables tracking of fine-grained finger and hand movements at very short distances while meeting the demands of low power consumption and miniaturization.
“Difficulties in adapting current radar technology for our application forced us to innovate from the ground up,” says the lead author. “We have developed a novel end-to-end pipeline ranging from radar signal processing to a machine learning model for gesture identification and classification.”
Their system works by transmitting short radar pulses at high frequencies to illuminate the hand. The researchers model the hand as a collection of distinct points, each of which reflects part of the signal back to the receiver. The received signals are then decomposed into a set of features that describe the hand’s orientation and motion, and these features are fed into a machine learning model that identifies the particular gesture being performed.
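To give a flavor of the kind of processing involved, the toy sketch below follows the same broad idea: a scatterer on the hand returns an echo whose phase depends on its range, the phase change between two pulses yields a radial-velocity feature, and a simple rule stands in for the classifier. All numbers and function names here are illustrative assumptions, not the paper’s actual pipeline.

```python
import math

# Toy sketch of a radar gesture pipeline (illustrative only).
C = 3e8             # speed of light, m/s
FREQ = 60e9         # an assumed millimeter-wave carrier (60 GHz)
WAVELEN = C / FREQ  # 5 mm wavelength

def received_phase(range_m):
    """Round-trip phase (radians) of an echo from a scatterer at range_m."""
    return (4 * math.pi * range_m / WAVELEN) % (2 * math.pi)

def doppler_velocity(phase_a, phase_b, pulse_interval_s):
    """Estimate radial velocity from the phase change between two pulses."""
    dphi = (phase_b - phase_a + math.pi) % (2 * math.pi) - math.pi
    return dphi * WAVELEN / (4 * math.pi * pulse_interval_s)

def classify(velocity_mps, threshold=0.01):
    """Stand-in 'classifier': is the hand approaching, receding, or still?"""
    if velocity_mps > threshold:
        return "receding"
    if velocity_mps < -threshold:
        return "approaching"
    return "still"

# A point on the hand moves 0.05 mm closer between two pulses 1 ms apart:
p1 = received_phase(0.10000)   # 10 cm away
p2 = received_phase(0.09995)
v = doppler_velocity(p1, p2, 1e-3)
print(classify(v))  # prints "approaching"
```

A real system would track many scatterers at once and feed a much richer feature set into a trained model, but the sketch shows why millimeter wavelengths matter: a sub-millimeter finger movement produces a measurable phase shift.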
Because radar senses motion rather than static shape, Soli is only capable of identifying motion-based gestures. For instance, Soli can classify four such gestures: clicking an imaginary button, turning an imaginary knob, sliding a virtual slider, and swiping left or right. “We call these movements action gestures,” says a senior member of the study. “What’s interesting is the relevance of these gestures to the kinds of actions a user might perform while interacting with, say, their smartphone. For example, a user might turn a knob to indicate that the volume must be lowered during a call, or swipe left to hang up.”
“We are extremely excited to explore the infinitely many applications Soli enables,” says another senior researcher and overall project leader. “We believe gesture sensing is a novel paradigm that will fundamentally change how humans interact with machines in the future. We hope our work will inspire other researchers to explore the immense capabilities provided by millimeter-wave radar technology, especially to the ever-evolving domain of human-computer interaction.”
###