ATAP’s Soli & Jacquard

Google’s Advanced Technology and Projects (ATAP) team recently shared a couple of research projects that introduce some pretty amazing new methods of interacting with technology.

Two of these projects, presented at Google I/O 2015 by ATAP’s director Ivan Poupyrev (who I’ve posted about before), look at the relationship between screen size and the level of detail and control that a user has.


Project Soli

The first, Project Soli, uses a tiny radar-based sensor to capture gesture and motion data. Because a Soli sensor is so much more sensitive than a camera, it can detect motion with an incredible level of accuracy. And because it’s so small, you can put the sensors almost anywhere.

Soli

Radar!

This opens up whole new ways of thinking about how we interact with devices. Poupyrev talked about how Soli sensors could be used to enable very detailed interactions with smart watches. And we can now think about how devices without displays could be incredibly responsive to our movements.

The interaction concept.

How the interaction would work in a product.

The project video gives lots more examples, and examines the thinking behind the project in much more depth. Definitely worth the time.

(As a side note, there’s something about this presentation that reminds me of the work that Berg did on the Lamps project for Google. But perhaps it’s just the blankness of the product, or the calmness of the interviews?)

Project Jacquard

The second project, Jacquard, looks at how to embed sensor technology into textiles (i.e. wearables — but really so much more). Their approach is to weave multi-touch panels into fabric — possible because the grid of a fabric is similar to that of a trackpad. It’s an interesting opportunity because the market is so vast — 150 times the size of mobile phones (in units)!

The scale of the garment industry
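To get an intuition for the trackpad analogy, here’s a toy sketch (not Jacquard’s actual pipeline — the grid size, threshold, and readings are all hypothetical): conductive threads woven as rows and columns form a sensing grid, and a touch location can be estimated as a weighted centroid of the intersections whose readings rise above a noise floor.

```python
def touch_centroid(readings, threshold=0.2):
    """Estimate a single touch point on a hypothetical woven sensor grid.

    readings: 2D list of normalized sensor values (0.0-1.0), one per
    row/column thread intersection. Returns (row, col) as a weighted
    centroid of above-threshold cells, or None if nothing is touched.
    """
    total = r_acc = c_acc = 0.0
    for r, row in enumerate(readings):
        for c, v in enumerate(row):
            if v > threshold:
                total += v
                r_acc += v * r
                c_acc += v * c
    if total == 0.0:
        return None
    return (r_acc / total, c_acc / total)

# A fingertip pressing near the middle of a 4x4 thread grid:
grid = [
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.8, 0.6, 0.0],
    [0.0, 0.5, 0.9, 0.1],
    [0.0, 0.0, 0.1, 0.0],
]
# touch_centroid(grid) lands between the four hot cells,
# near row 1.5, column 1.5.
```

The same loop scales to any weave density, which is part of why a fabric grid maps so naturally onto existing touch-sensing math.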

So they worked to develop ways to move this from a novelty to something that could be incorporated into existing textile manufacturing processes. And they’ve created fibers that look and feel like traditional fibers, and can withstand the harsh treatment that clothing has to endure. The result is a fabric that supports a wide range of interactions.


Touch, multi-touch, and even just proximity are sensed.

By releasing this into the world, fashion designers can invent new uses for the fabric, and explore how people may want to interact with it. (Need ideas of what’s possible? Look at the Fashioning Tech blog.) And it’s not just for fashion — the fabric could be used in furnishings, wall coverings, or any number of other applications.

Swipe on the sleeve to make a phone call.

A partnership with Levi’s was announced.

The full Jacquard demo — again, definitely worth the time to watch…

My only critique is that the use case demos look pretty traditional… they’ve moved screen gestures to fabric or to the air above a product, but the gestures and the fundamental interactions remain the same. What we really want to see are the new and unexpected things these technologies can enable, and the new ways we can interact.

But first steps first. Google is releasing this stuff so that designers and developers can invent what’s next. There are lots of exciting and unimagined possibilities. Stay tuned…