Project Soli can now recognize objects in real time

Project Soli is one of the coolest projects we’ve ever seen out of X (formerly known as Google [x]). Soli uses a radar sensor to pick up hand movements in space – making interactions with your smartwatch possible via hand gestures rather than through screen-based taps and swipes. As if that wasn’t already cool enough, Soli has just been granted superpowers allowing it to recognize objects.

But it wasn’t Google that gave Soli these new abilities. Researchers at the University of St. Andrews, who were given a Soli AlphaKit by Google, souped it up in a project dubbed RadarCat. RadarCat is so advanced it can instantly and accurately identify various objects placed on Soli’s radar.

RadarCat can not only identify the various hand gestures that so impressed us about Project Soli, but it can also tell the difference between apples and oranges, as well as between full, empty and half-full glasses of various liquids – all using radar alone. RadarCat can even distinguish between different parts of your body, so it knows if it's "looking" at your hand, your wrist, your forearm or your leg.

Radio waves bouncing off an object create a very specific pattern akin to its fingerprint.

RadarCat does this by bouncing radio waves off an object touching the Soli sensor. The reflections form a highly specific pattern, akin to the object's "radio fingerprint"; think of it like the echolocation bats use.

These radar-based echoes are so accurate and stable that RadarCat can even differentiate between materials like glass, steel, plastic and copper, or tell which side of an object is lying face down on the Soli sensor. And because RadarCat employs machine learning, the list of recognizable objects only grows over time.
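To get an intuition for how fingerprint-style recognition works, here is a minimal sketch in Python. It is purely illustrative, not RadarCat's actual pipeline: it assumes each material's radar echo has been reduced to a small feature vector (the "fingerprint"), and classifies a new reading by finding the nearest stored fingerprint. All material names and feature values are made up for the example.

```python
# Hypothetical sketch of fingerprint-based material recognition.
# Assumption: each radar echo is summarized as a 3-number feature vector
# (e.g. mean amplitude, variance, dominant frequency) - these features
# and values are invented for illustration.

import math

# Toy "trained" fingerprints: one averaged feature vector per material.
FINGERPRINTS = {
    "glass":   [0.82, 0.10, 3.1],
    "steel":   [0.95, 0.02, 7.4],
    "plastic": [0.40, 0.25, 1.8],
    "copper":  [0.91, 0.04, 6.2],
}

def classify(reading):
    """Return the material whose stored fingerprint is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(FINGERPRINTS, key=lambda m: dist(FINGERPRINTS[m], reading))

print(classify([0.93, 0.03, 7.0]))  # nearest stored fingerprint is "steel"
```

In this toy version, teaching the system a new object is just a matter of storing another fingerprint, which loosely mirrors why a learned classifier's set of recognizable objects can keep growing as more training examples are collected.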