Google I/O 15 - Project Soli

As Google I/O drew closer this year, it was hard for techies not to get excited about what might be unveiled. Every year there is usually something that spurs us into a frenzy. Last year it was the hugely appreciated overhaul of the Android operating system, Lollipop (L for short), which forced app developers to rethink the design of their existing apps. This year Google has unveiled Android M, L’s successor. The OS is still in developer preview, and as such has not been released to the general public. Google has decided to keep what the ‘M’ stands for to themselves, at least for the time being. I’m not entirely sure why. Perhaps it’s another way to spark discussion about the new OS by having bloggers and video producers ask their viewers the question. Free advertising. Well played, Google.

Android M doesn’t quite have the oomph that Android L did. There’s no new “big feature”; just additions of existing technology that other companies have had for a while, such as fingerprint scanning and a payment service. There are no major UI changes apart from the App Drawer, which, by the way, most people have hated. Overall, nothing special. Nothing to be overly thrilled about. Let’s leave the Android OS aside and focus on what else Google I/O 15 had to offer.

One thing that stuck out from the rest, and genuinely had the crowd squealing like schoolgirls, is Google ATAP’s Project Soli. The brainchild of Ivan Poupyrev, Project Soli is a hypersensitive radar intended to track the motion of the human hand. Poupyrev is fascinated by the range of movement the human hand offers, and his wish is to harness it by allowing you, the user, to be the interface for your devices. The hardware itself is a chip no bigger than a thumbnail. It transmits radar waves towards a target, and a receiver intercepts the energy reflected back from that target. The reflected energy comes from tiny movements made within range of the device, such as a snap of your fingers. Here, Poupyrev demonstrates his concept.
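To make the idea concrete, here is a minimal toy sketch of how a radar-based gesture sensor might work in principle: a moving finger Doppler-shifts the reflected signal, and the size of that shift hints at the gesture. Everything below (the sample rate, thresholds, and gesture names) is my own illustration, not Google ATAP’s actual pipeline.

```python
import numpy as np

# Illustrative constants: Soli-style radars operate around 60 GHz,
# which corresponds to a wavelength of roughly 5 mm.
SAMPLE_RATE = 2000          # baseband samples per second (illustrative)
WAVELENGTH = 0.005          # metres, ~60 GHz carrier

def simulate_reflection(finger_velocity, seconds=0.1):
    """Simulate the baseband signal reflected from a finger moving
    at a constant velocity (metres per second)."""
    t = np.arange(0, seconds, 1.0 / SAMPLE_RATE)
    doppler_hz = 2 * finger_velocity / WAVELENGTH   # radar Doppler shift: 2v / wavelength
    return np.cos(2 * np.pi * doppler_hz * t)

def dominant_doppler(signal):
    """Estimate the strongest Doppler frequency with an FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def classify(signal):
    """Map a Doppler signature to a coarse, made-up gesture label."""
    f = dominant_doppler(signal)
    if f < 50:
        return "idle"
    return "slow rub" if f < 400 else "fast snap"

# Rubbing a thumb against a finger is slow; a snap is fast.
print(classify(simulate_reflection(finger_velocity=0.2)))   # -> slow rub
print(classify(simulate_reflection(finger_velocity=2.0)))   # -> fast snap
```

A real system would track range, velocity, and energy over time and feed those features to a trained classifier, but the core signal-processing idea is the same.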

Some of you eagle-eyed viewers may have noticed that when Poupyrev attempted to change the hour on the clock to 9, he completely missed. The hardware needs refining, and so far I can’t see it being very useful in its current state. That’s not to say Google ATAP aren’t onto something here, though. Give the team some time and who knows what this technology could lead to.

Take smart watches, for example. They are the technology of today. Not everyone has one, and not everyone sees the point in having one, but there will come a time when they will be ubiquitous, just like cell phones. Combining smart watches with Project Soli would be a match made in heaven. According to Poupyrev, the radar works through materials. You could be wearing long sleeves and gloves and still flick your wrist to activate the watch, skip a track, accept calls, and change the volume, all without touching the watch or looking at the screen.
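On the software side, that pairing could be as simple as dispatching recognized gestures to watch actions. The gesture names and actions below are my own invention for illustration, assuming a recognizer like the sketch above.

```python
# Hypothetical gesture-to-action mapping for a Soli-equipped watch.
WATCH_ACTIONS = {
    "wrist_flick": "wake screen",
    "swipe_left":  "skip track",
    "double_tap":  "accept call",
    "dial_turn":   "adjust volume",
}

def handle_gesture(gesture):
    """Dispatch a recognized gesture to a watch action."""
    # Unrecognized motion is ignored so everyday movement
    # (scratching your nose, say) doesn't trigger anything.
    return WATCH_ACTIONS.get(gesture, "ignored")

# Because radar passes through fabric, these events could fire even
# with a sleeve or glove covering the hand.
for g in ("wrist_flick", "swipe_left", "scratch_nose"):
    print(g, "->", handle_gesture(g))
```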

Imagine this technology coupled with virtual reality devices. Currently, most VR companies are struggling to precisely track the 3D position of objects relative to one another. With Project Soli, your headset could beam a menu in front of you, which you could control with hand gestures. Consider using Maps on the optical head-mounted display of Google Glass to get to a local coffee shop. Perform a few gestures and you could be viewing a 3D street walkthrough of how to get to the shop before you even take the first step. This might not catch on as quickly as Google would hope, however. The problem lies with what is classed as acceptable social behaviour. Can you imagine how odd you might look flinging your arms about for no apparent reason? It will take time for people to adjust to this being normal, just as it did with people chatting away to seemingly no-one via a Bluetooth headset. But with the countless possibilities of what Project Soli could link with, I have absolutely no doubt that Google are onto a winner here.

Written on May 31, 2015