The first time I experienced a haptic interface was back in 1990, when I got a demo of Margaret Minsky’s Sandpaper system at MIT. Using a force-feedback joystick, users could move the screen’s cursor across a textured area and the joystick would simulate the feeling of resistance and surface. It was pretty cool to see how digital space didn’t need to be disconnected from our senses.
To this day, however, haptics remain fairly niche. Some products incorporate them — such as game controllers and touchscreen phones — but they do little more than rumble. So it was interesting to stumble upon some current research and technologies that point to a more engaging future.
Fabian Hemmert’s weight-shifting mobiles concept uses a moving weight to change a phone’s center of balance. He gives some pretty interesting examples of how this might be applied to interfaces. For example, the weight could react to where the user is touching, and his “tactile compass” uses the weight to help communicate direction of travel.
Technology Review recently wrote about how electrovibration can be used to give much more detailed feedback on glass touch displays. Using the technology, surfaces can be made to feel as though they are bumpy, rough, sticky, or vibrating. TeslaTouch has a cool demonstration video on its website. And Tapani Ryhänen, Nokia lab director in Cambridge, UK, talks about future uses: “There’s a possibility to use this as a type of communication, so if I do something on my screen, then you can feel it on your screen.”
Lastly, Francesco Cosco shows how haptics can be combined with augmented reality. His demo lets users see and touch objects in a real scene while rendering the haptic device itself invisible, so it doesn’t obstruct the view.