Last year, Google unveiled Project Soli, presenting it as a way to build gesture controls for future devices. Soli’s miniature radar is small enough to fit easily into a smartwatch and can detect movements with sub-millimeter accuracy, letting you control a speaker’s volume, say, by twiddling an imaginary dial in mid-air. Now, a group of researchers from the University of St Andrews in Scotland have used one of the first Project Soli developer kits to teach Google’s technology a new trick: recognizing objects using radar.
The scientists are using machine learning to help the radar identify different materials, such as air and steel, and even specific items. The system, dubbed RadarCat, can also tell an orange from an apple, distinguish individual body parts, and tell an empty glass from one full of water.
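The idea of classifying objects from radar readings can be sketched with a toy nearest-neighbour classifier. The feature values and material labels below are invented for illustration; RadarCat's actual features come from Soli's signal-processing pipeline, and the real system uses a more sophisticated machine-learning model.

```python
# Toy sketch: classify a material from a radar "feature vector" using a
# nearest-neighbour rule. All numbers here are made up for illustration.
import math

# Hypothetical (amplitude, frequency-shift) training samples per material.
TRAINING = {
    "air":   [(0.10, 0.20), (0.15, 0.25)],
    "steel": [(0.90, 0.80), (0.85, 0.90)],
    "water": [(0.50, 0.60), (0.55, 0.65)],
}

def classify(sample):
    """Return the material whose training point lies nearest to `sample`."""
    best_label, best_dist = None, float("inf")
    for label, points in TRAINING.items():
        for point in points:
            d = math.dist(sample, point)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label

print(classify((0.88, 0.85)))  # a steel-like reading -> "steel"
```

In practice a nearest-neighbour rule like this only works once the raw radar return has been reduced to discriminative features, which is where most of the engineering effort lies.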
What is the Soli chip?
Google’s Soli chip is smaller than a quarter, measuring 8mm x 10mm and packing both the sensor and the antenna array. According to Google, the chip broadcasts a wide beam of electromagnetic waves. When an object enters the beam, the energy is scattered in a way specific to that object, so the sensor can extract properties such as size, shape, orientation, and material from the returned pattern.
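The sensing idea described above can be illustrated with a minimal sketch: a window of received-signal samples is reduced to a small feature vector, and different objects produce measurably different values. The feature names, maths, and sample values here are assumptions for illustration, not Soli's actual signal processing.

```python
# Hypothetical sketch: reduce a window of received radar samples to a
# small feature vector. A strongly reflecting object (e.g. metal) scatters
# back more energy than air, so the features separate the two cases.

def extract_features(samples):
    """Summarise a window of signal samples as (energy, spread)."""
    n = len(samples)
    energy = sum(s * s for s in samples) / n            # average power
    mean = sum(samples) / n
    spread = sum((s - mean) ** 2 for s in samples) / n  # variance
    return energy, spread

# Made-up return signals for two scenes.
metal_return = [0.80, -0.70, 0.90, -0.85]
air_return = [0.05, -0.04, 0.06, -0.05]

print(extract_features(metal_return)[0] > extract_features(air_return)[0])
# prints True: the metal scene returns far more energy
```

A classifier trained on such feature vectors is what turns raw scattering patterns into labels like "steel" or "full glass".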
The next step for RadarCat’s creators is to improve the system’s ability to distinguish between similar objects; they suggest it could eventually not only say whether a glass is full or empty, but also classify its contents. If the tech ever moves into mainstream use, it would be quite the evolution: from a military technology used to detect ships and airplanes to a consumer one that can tell you exactly what you are about to drink.