Back in the ‘90s, Steve Mann at MIT was strapping cameras to every part of his body. Bloody-minded about his vision of wearable technology, he’s now acknowledged as the founding father of a whole new category and industry. The first ‘Glasshole’, he was amazingly prescient: his EyeTap from 1999, with its ‘Glass Eye’ camera, is near-identical to Google Glass, which arrived 14 years later.
But these devices never took off, then or now, simply because this form of wearable computing isn’t natural or intuitive. You have to wear something unnatural (a camera and screen on your face) and learn and adopt new gestures and behaviours (jerking your head backwards and looking up to the sky to activate Glass). Not so elegant, and slightly concerning in public.
When we were living in New York we used to have ‘maker sessions’ (tech speak for ‘brainstorm’, where something actually gets made, not just talked about) at the ACE Hotel, near Google’s offices. Occasionally we’d see a Google employee out wearing prototype Glass well before its release. Word had it that Google would only allow attractive people out of the building wearing the headsets. Sadly, this didn’t help matters: it made even their gorgeous employees awkward and unnaturally conspicuous. Which is a clue to where wearables are heading: into the realm of the discreet and intuitive, blending easily into our daily lives.
Google is slowly course-correcting with its skunkworks team at Advanced Technology and Projects (ATAP). It’s led by Ivan Poupyrev, a Russian chap we met in New York who was head of innovation at Disney at the time. And its first two projects are already gathering keen attention.
Project Jacquard: ‘Someone hacked my pants’…
Poupyrev and his team realised that the woven, mesh-like structure of textiles mirrors the grid structure of the touchscreens on our everyday devices. So if they replaced some ordinary threads with conductive threads, they could create a fabric that recognises a series of touch gestures, just like a mobile phone or tablet. And imagine what you could do with that.
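To make the touchscreen analogy concrete, here’s a minimal sketch (not Google’s actual Jacquard software, and the function names are invented) of the principle: a touch raises the signal on one horizontal and one vertical conductive thread, and their intersection gives the touch point.

```python
# Hypothetical sketch of touch localisation on a woven conductive grid.
# A touch raises the signal on one row thread and one column thread;
# where they cross is where the finger is.

def locate_touch(row_signals, col_signals, threshold=0.5):
    """Return (row, col) of the strongest activated thread pair, or None."""
    row = max(range(len(row_signals)), key=lambda i: row_signals[i])
    col = max(range(len(col_signals)), key=lambda j: col_signals[j])
    if row_signals[row] < threshold or col_signals[col] < threshold:
        return None  # no thread crossed the activation threshold
    return (row, col)

# A touch near row thread 2 and column thread 1:
print(locate_touch([0.1, 0.2, 0.9, 0.1], [0.0, 0.8, 0.1]))  # → (2, 1)
```

The same crossed-grid idea underlies capacitive touchscreens; weaving it from thread is what makes the fabric itself the sensor.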
We’re moving away from dedicated electronics and making the basic materials of the world around us interactive. Any fabric can now have interactivity woven in, integrating computing invisibly into our clothing and letting it disappear from our palms and our dinner tables.
Levi’s is the first brand to collaborate with Google on this. I’m excited because I pretty much only wear Levi’s. So if my phone can talk to my pants then I won’t need a separate activity tracker because my pants can do that better, and I won’t need haptic feedback on my smart watch because my shirt cuff could do that more discreetly than an Apple Watch.
Google’s also just invented the best controller in the world—and you already have two of them…
Over the next five years the number of connected devices is expected to almost triple, from 9 billion today to around 24 billion, according to the GSMA. And control of these electronic devices will be key. Wearable clothing might control devices on me, but motion controllers will likely manage the computing around me.
Motion controllers admittedly aren’t new: Leap Motion, Xbox Kinect and Nintendo Wii have been around for a while, tracking hand and body movements through cameras to effectively replace a mouse or a keyboard. But camera-based controllers have limited accuracy and they don’t work in the dark. Soli, Google’s new controller, uses radar, which detects objects in motion through high-frequency radio waves, a bit like a cop on a highway using his radar gun to catch speeders. But this radar tracks the smallest micro-gestures of your hand movement in 3D, which means you can make really tiny, nuanced gestures and it will interpret them. Imagine you were talking to a mate across a crowded room. How might you ask them to turn the volume up? Probably with your thumb and index finger together, slowly dialling up the music. Soli will understand that gesture and increase the volume.
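The control loop a Soli-style sensor implies can be sketched in a few lines. This is purely illustrative, assuming the radar hardware has already classified a micro-gesture by name; the gesture names and the mapping below are invented, not Soli’s actual API.

```python
# Hypothetical sketch: map a recognised micro-gesture to a device action.
# The 'dial' gestures stand in for a thumb rubbing against an index finger,
# like turning an invisible volume knob.

ACTIONS = {
    "dial_clockwise": +5,
    "dial_anticlockwise": -5,
}

def apply_gesture(volume, gesture):
    """Adjust a 0-100 volume according to a recognised micro-gesture."""
    delta = ACTIONS.get(gesture, 0)  # unrecognised gestures do nothing
    return max(0, min(100, volume + delta))

vol = 50
vol = apply_gesture(vol, "dial_clockwise")  # dial up → 55
vol = apply_gesture(vol, "wave")            # unrecognised → unchanged
print(vol)  # → 55
```

The interesting design problem isn’t this mapping — it’s the classifier in front of it, turning raw radar reflections into gesture names accurately enough to feel natural.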
What’s the opportunity for brands when screens disappear?
When phones are invisibly tucked into pockets, technology disappears into fabric and gestures become ubiquitous, it raises the question: how will your brand stand out without visual cues? Apple, Xerox and Microsoft have been quietly patenting gestures for two decades. When you pinch and zoom on your screen, or shake your phone to perform an undo action, those gestures are patented (curiously, Nintendo never patented any of its Wii gestures, just the hardware).
So, what’s your brand’s gesture? How is it different from your competitors’, and can you own it? Not own the patent, which is always tough to protect anyway, but own the gesture in the hearts and minds of consumers?