Imagine opening your favorite recipe app just by holding your smartphone to your stomach. Or imagine that the moment you take the last sip of water, a server is alerted and knows to refill your glass. This is the technology behind RadarCat (Radar Categorization for Input and Interaction), a device that identifies many kinds of objects. Whether it’s distinguishing the contents of a glass or telling one body part from another, the St. Andrews Computer Human Interaction (SACHI) research group at the University of St. Andrews is testing the boundaries of machine learning.
RadarCat is still at the concept stage, and the biggest challenge for the SACHI group is turning it into an actual product. Using a sensor provided by Google’s ATAP (Advanced Technology and Projects) division, the technology can identify materials and objects in real time. Professor Aaron Quigley, Chair of Human Computer Interaction at the University, said, “The Soli miniature radar opens up a wide range of new forms of touchless interaction. Once Soli is deployed in products, our RadarCat solution can revolutionize how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction.”
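To make the idea of material recognition concrete, here is a minimal, purely illustrative sketch of the kind of classification such a system might perform. The article does not describe RadarCat’s actual algorithm; the feature names, training values, and nearest-centroid approach below are all invented assumptions, standing in for the far richer multi-channel signals a real Soli sensor would produce.

```python
import math

# Hypothetical training data: each object class maps to a few example
# "radar signatures", here invented 3-number feature vectors (e.g.
# reflected energy, surface response, depth response). Values are
# made up for illustration only.
TRAINING_SIGNATURES = {
    "empty glass": [(0.90, 0.20, 0.70), (0.85, 0.25, 0.72)],
    "full glass":  [(0.40, 0.20, 0.30), (0.45, 0.18, 0.33)],
    "keyboard":    [(0.60, 0.80, 0.50), (0.62, 0.78, 0.52)],
}

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n
                 for i in range(len(vectors[0])))

def classify(signature, training=TRAINING_SIGNATURES):
    """Return the label whose training centroid is closest
    (in Euclidean distance) to the observed signature."""
    centroids = {label: centroid(vecs) for label, vecs in training.items()}
    return min(centroids,
               key=lambda label: math.dist(signature, centroids[label]))

# A new reading close to the "full glass" examples is labeled accordingly.
print(classify((0.43, 0.19, 0.31)))  # prints "full glass"
```

The real system would learn from thousands of sensor readings per object rather than a handful of hand-written vectors, but the core task, mapping a signal to the nearest known material signature, is the same.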
Whether it’s identifying the different contents of two identical bottles or automating waste sorting at a recycling facility, there’s an ever-growing list of practical real-world applications awaiting RadarCat’s technology. “Beyond human-computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control,” Professor Quigley said.
In a recent iReviews article entitled “AI Still Not Perfect: Experts Give Three Reasons Why,” Neil Lawrence, Professor of Machine Learning at the University of Sheffield and part of Amazon’s AI team, explains one of the biggest problems facing machine learning: “these systems don’t just require more information than humans to understand concepts or recognize features, they require hundreds of thousands of times more.” Lawrence says that tech giants like Google, Facebook and Microsoft are perfectly positioned for AI. “They have abundant data and so can afford to run efficient machine learning systems.” Fortunately, Quigley and the SACHI group are building on Google’s sensing and machine learning work from Project Soli. The technology torch has been passed to the scientists at the University of St. Andrews.
Groundbreaking technology tends to find its way to socially beneficial uses, and RadarCat looks to be no exception. Once it matures into a prototype, RadarCat could help blind users distinguish between two identical bottles with different contents, recognize household objects such as computer keyboards, and assist with foreign language learning. The future of RadarCat will depend on advances in machine learning, and with its Google partnership the SACHI group appears to have access to the most important machine learning resource of all: data.