Project Soli can now recognize objects in real time
Project Soli is one of the coolest projects we’ve ever seen out of X (formerly known as Google [x]). Soli uses a radar sensor to pick up hand movements in space – making interactions with your smartwatch possible via hand gestures rather than through screen-based taps and swipes. As if that wasn’t already cool enough, Soli has just been granted superpowers allowing it to recognize objects.
But it wasn’t Google that gave Soli these new abilities. Researchers at the University of St. Andrews, who were given a Soli AlphaKit by Google, souped it up in a project dubbed RadarCat. RadarCat is so advanced it can instantly and accurately identify various objects placed on Soli’s radar.
RadarCat can not only identify the various hand gestures that so impressed us about Project Soli, but it can also tell the difference between apples and oranges, as well as between full, empty and half-full glasses of various liquids – all using radar alone. RadarCat can even distinguish between different parts of your body, so it knows if it's "looking" at your hand, your wrist, your forearm or your leg.
RadarCat does this by bouncing radio waves off an object touching the Soli sensor. The reflected signal creates a very specific pattern akin to the object's "radio fingerprint" – think of it like the echolocation bats use.
These radar-based echoes are so accurate and stable that RadarCat can differentiate between materials like glass, steel, plastic and copper, and can even tell which side of an object is lying face down on the Soli sensor. Because RadarCat employs machine learning, the list of recognizable objects only grows over time.
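RadarCat's actual classifier is trained on real multi-channel Soli signal features, but the core idea – matching an unknown "radio fingerprint" against examples learned from known materials – can be sketched with a minimal nearest-centroid classifier. Everything below (the feature vectors, the materials, the numbers) is hypothetical, purely to illustrate the matching step:

```python
# Toy sketch of fingerprint matching, NOT RadarCat's real pipeline: each
# made-up 3-element vector stands in for features extracted from a radar echo.
import math
import random

def centroid(vectors):
    """Mean feature vector for one material class."""
    n, dims = len(vectors), len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(dims)]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(samples):
    """samples: {material: [feature_vector, ...]} -> {material: centroid}"""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def classify(model, fingerprint):
    """Return the material whose learned centroid is closest to the fingerprint."""
    return min(model, key=lambda label: distance(model[label], fingerprint))

random.seed(0)
def noisy(base):
    # Simulate measurement noise around an idealized material signature.
    return [x + random.gauss(0, 0.05) for x in base]

training = {
    "glass":  [noisy([0.9, 0.1, 0.3]) for _ in range(5)],
    "steel":  [noisy([0.2, 0.8, 0.7]) for _ in range(5)],
    "copper": [noisy([0.4, 0.6, 0.9]) for _ in range(5)],
}
model = train(training)
print(classify(model, [0.88, 0.12, 0.31]))  # -> glass
```

The "list grows over time" property falls out naturally: adding a new material is just adding another labeled entry to the training set and recomputing its centroid.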
The applications for RadarCat are huge, and the researchers came up with a variety of use cases for the technology, some of which you can see in the video above. From displaying nutritional information about fruits and vegetables placed on the sensor to automatically bringing you a fresh drink when your glass is empty, RadarCat has a lot of potential.
But the St. Andrews team aren't the only ones researching gesture recognition on wearables. Researchers from Carnegie Mellon University (some of whom came up with a very clever new device pairing technology we covered recently) have also hacked a smartwatch's accelerometer to make it work much like Project Soli, but without the need for radar.
Called ViBand, the project uses a custom kernel to ramp up the sample rate of a smartwatch's accelerometer: where a typical accelerometer samples at around 100 Hz, ViBand operates at 4,000 Hz, making it much more sensitive to even the finest movements in its immediate vicinity.
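A rough intuition for why the 40x jump matters: by the Nyquist limit, a sensor sampling at rate R can only represent vibrations below R/2 – about 50 Hz on a stock 100 Hz accelerometer versus 2,000 Hz at ViBand's rate. The sketch below, with a made-up 500 Hz "micro-vibration" standing in for a bio-acoustic signal, shows such a signal being recovered from high-rate samples (the frequencies here are illustrative, not from the paper):

```python
# Toy demonstration: recover a vibration's frequency from high-rate samples.
# A 500 Hz signal is resolvable at 4,000 Hz (500 < 4000/2) but would alias
# on a typical 100 Hz accelerometer (500 >> 100/2 = 50).
import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Naive DFT: return the frequency (Hz) of the strongest non-DC bin."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        mag = abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                      for t in range(n)))
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n

rate = 4000  # ViBand's boosted sample rate
vibration = [math.sin(2 * math.pi * 500 * t / rate) for t in range(400)]
print(dominant_frequency(vibration, rate))  # -> 500.0
```

A production system would of course use an FFT rather than this O(n²) DFT, but the principle is the same: the higher the sample rate, the higher-frequency the vibrations the software can see and classify.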
Bio-acoustic signals – essentially micro-vibrations that travel through the wearer’s arm – are picked up by the accelerometer and identified by software. ViBand can be used to manipulate virtual buttons, sliders or dials as well as to recognize larger gestures like claps, flicks of the wrist and taps on various locations of the arm and palm.
The range of gestures is just as wide as Project Soli's, although they are admittedly a little more "rugged" than some of the fine gestures demoed by Google. Still, being able to recreate Soli-like results on an existing smartwatch simply by ramping up accelerometer sensitivity is an impressive feat.
As far as battery consumption goes, the researchers tell me ViBand only uses around twice as much power as a normal accelerometer, which is admittedly very little. The team is quick to point out that this is more of a proof of concept than a final solution, though: "In a real commercial version of this, a special co-processor would be used, like what 'OK Google' uses now – something special purpose and very low power."
Are you excited to see Project Soli, RadarCat and ViBand come to market? What applications can you imagine for them?