Tech giant reveals new methods of interacting with devices.
Google made two major wearable technology announcements at its annual I/O conference, but neither was a new watch, an activity tracker or the rumored next version of Google Glass. Neither was a device at all. Rather, they were a pair of deep research projects that have the potential to radically change our perception of “wearable technology.”
The first, Project Jacquard, has created conductive thread that can be seamlessly woven into textiles – apparel, furniture and more. “For textile designers or fashion designers or furniture designers, it is interesting because it’s something you are very familiar with. It’s just textile,” says Shiho Fukuhara, textile development and partnership lead for Project Jacquard, in a video released by Google. Fukuhara says the thread behaves and weaves just like normal yarn, and can either be invisibly integrated into textiles or noticeably raised for users to interact with. “That’s up to the designers to choose,” Fukuhara says. “That’s up to their creativity.”
At its I/O conference in late May, Google demoed the technology using fabric as a touchscreen input. Attendees could move their fingers over a threaded grid while a display showed how software was reading their movements in real time – taps, swipes, pressure and more.
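To make the demo concrete, here is a minimal sketch of how software might classify readings from a conductive-thread grid as a tap, swipe or press. The data format, function name and thresholds are all assumptions for illustration; Google has not published Jacquard's actual processing at this level of detail.

```python
# Hypothetical sketch: classifying touch samples from a conductive-thread
# grid. Each sample is an assumed (x, y, pressure) reading; the thresholds
# below are illustrative, not values Google has published.

def classify_touch(samples, swipe_distance=3.0, press_threshold=0.8):
    """Classify a sequence of (x, y, pressure) samples as a gesture.

    Little movement reads as a tap; sustained movement across the grid
    reads as a swipe; high pressure at any point reads as a press.
    """
    if not samples:
        return "none"
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    # Straight-line distance between first and last contact points.
    travel = ((xs[-1] - xs[0]) ** 2 + (ys[-1] - ys[0]) ** 2) ** 0.5
    if max(s[2] for s in samples) >= press_threshold:
        return "press"
    if travel >= swipe_distance:
        return "swipe"
    return "tap"
```

A real implementation would work on a continuous stream of sensor readings rather than a finished list, but the classification idea is the same.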
The technology has robust possibilities. The most obvious example is that clothing can interact with smartphones, perhaps to answer phone calls, acknowledge notifications and much more. Fabrics would also be able to interface with countless other smart devices that now populate homes, cars, offices and more. Google announced it is partnering with Levi’s on the project. “In our hyper-digital world, people constantly struggle to be physically present in their environment, while maintaining a digital connection,” says Paul Dillinger, Levi’s head of global product innovation. “The work that Google and Levi’s are embarking upon with Project Jacquard delivers an entirely new value to consumers.”
The second announcement, Project Soli, introduces sensors that, through radar, detect hand movements which can control smartphones and other devices. “Radar has been used for many different things – to track cars, big objects, satellites and planes,” says Ivan Poupyrev, technical program lead for Google’s Advanced Technology and Projects division, which oversees both projects. “We’re using it to track micro motions, twitches of human hands, and use that to interact with wearables and other computing devices.”
Google’s Project Jacquard (https://youtu.be/qObSFfdfe7I) has created conductive thread that can be woven into textiles to control smart devices.
Project Soli (youtu.be/0QNiZfSsPc0) uses radar to detect hand movements, which could then be used to control smartphones, watches and more.
A video released by Google demonstrates how hand movements could potentially interact with devices. Tapping the index finger to the thumb can simulate pressing a button, and rubbing the two fingers together can adjust a slider up or down. At an I/O conference presentation, Poupyrev demonstrated the technology in action by changing the time setting on his watch simply by rubbing his index finger and thumb together.
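The mapping described above – a finger tap as a button press, a finger rub as a slider adjustment – can be sketched as a simple dispatch table. The gesture names and the function below are hypothetical, invented for illustration; they are not part of any published Soli API.

```python
# Hypothetical sketch: mapping radar-detected micro-gestures to device
# actions, as described in Google's Soli video. Gesture names and actions
# are assumptions for illustration only.

def handle_gesture(gesture, slider_value, delta=0.1):
    """Map a recognized micro-gesture to a (action, slider_value) pair."""
    if gesture == "finger_tap":           # index finger taps thumb
        return ("button_press", slider_value)
    if gesture == "finger_rub_forward":   # fingers rub one direction
        return ("slider_change", min(1.0, slider_value + delta))
    if gesture == "finger_rub_backward":  # fingers rub the other direction
        return ("slider_change", max(0.0, slider_value - delta))
    return ("none", slider_value)
```

In Poupyrev's watch demo, for example, a stream of repeated rub gestures would step a setting like the time up or down one increment at a time.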
In the video, Poupyrev adds that the radar can work through materials and can be embedded in objects. “And what is most exciting about it is you can shrink the entire radar and put it in a tiny chip,” he says. “That’s what makes this approach so promising. It’s extremely reliable. There is nothing to break.”
If successful, the motion controls could have far-reaching possibilities, replacing input technologies like touchscreens, mice and keyboards. At the very least, they address the problem of input on very small wearable devices, such as smartwatches.
C.J. Mittica is the editor of Wearables. Contact him at firstname.lastname@example.org and follow him on Twitter at @CJ_Wearables.