Touchscreens offer a simple and intuitive way to let users control smartphones and tablets, but they transmit little of the information we normally receive by feeling things. Surface roughness and raised buttons are just two tactile cues that convey huge amounts of information about an object being touched in the real world.
Now, researchers are finding new ways to give touch-based devices that missing sense of feel. They’re using electric fields to mimic the 3-D feel of bumps, ridges and edges on flat surfaces, and sound waves to generate the illusion of touch in midair.
As most people by now know, touchscreens are by nature smooth. Their lack of texture stands in stark contrast to physical buttons, which people can operate by feel alone. To overcome this shortcoming, researchers worldwide are investigating the field of haptics, which looks to imbue devices with the advantages of feel and touch.
Scientists at Disney Research have now shown that they can recreate the feel of different surface features on a smooth object. The secret, detailed on Oct. 11 at the ACM Symposium on User Interface Software and Technology in St. Andrews, Scotland, involves stretching the skin on fingers as they slide on touchscreens.
"This could be on a wall, on flat monitor screens, screens of tablets and phones," says Disney Research’s Ali Israr.
Their work alters the friction that skin feels on a surface by using electrostatic forces, the same forces that attract plastic wrap to a person’s fingers. The technique works by building up electric charge on the surface in question.
When people slide their fingers over real bumps, they perceive the surface feature largely because friction stretches and compresses skin on the finger. By electrically altering the friction a fingertip encounters as it glides across a surface, their technique can fool the brain by creating the illusion of a physical bump.
The researchers then developed a system that changes the friction a sliding finger experiences to match the tactile properties desired for images on a touchscreen. It dynamically adjusts these sensations as the size and placement of images on the display change.
"The technology could create content for touchscreens of mobile phones, tablets, surfaces and projections, and the algorithm creates features such as bumps, protrusions, edges, texture and many more on the fly," Israr says.
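The core idea can be sketched in code. In a minimal model of this kind of friction rendering, the display holds a virtual height map of the features to draw, and the friction command at any instant is proportional to the surface slope along the finger's direction of travel: climbing the side of a virtual bump increases friction, descending it decreases friction. The function below is an illustrative sketch under those assumptions; the names, the linear gain, and the output range are not Disney's published parameters.

```python
import numpy as np

def friction_command(height_map, x, y, vx, vy, gain=1.0):
    """Map the local slope of a virtual height map, taken along the
    finger's velocity (vx, vy), to a friction-modulation level.
    Illustrative sketch only, not the published Disney algorithm."""
    # Finite-difference gradient of the virtual surface.
    # np.gradient returns derivatives along (rows, cols) = (y, x).
    gy, gx = np.gradient(height_map)
    speed = np.hypot(vx, vy)
    if speed == 0:
        # A stationary finger feels no sliding friction to modulate.
        return 0.0
    # Slope of the virtual surface along the direction of travel.
    slope = (gx[y, x] * vx + gy[y, x] * vy) / speed
    # Climbing (positive slope) -> more friction; descending -> less.
    # Clip to a normalized actuator range.
    return float(np.clip(gain * slope, -1.0, 1.0))
```

Sweeping a finger across such a map yields a friction profile that rises on one side of a bump and falls on the other, which is the asymmetric skin stretch the brain reads as a physical ridge.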
Instead of changing how touchscreen surfaces feel, scientists at the University of Bristol in England are changing the feel above the displays. Using sound waves, they found they could help people experience sensations above a touchscreen without having to hold or touch it, an approach they call UltraHaptics.
"The most exciting thing about UltraHaptics is the power and the fidelity of the tactile sensations it generates in mid-air," says Sriram Subramanian, a human-computer interaction researcher at the University of Bristol. "You don’t have to wear anything like gloves to experience tactile sensations with UltraHaptics, or hold a controller."
The system involves an array of ultrasound emitters located beneath a video display. When sound waves converge at the same points at the same time, they can exert pressure on bare hands to create sensations on the skin.
"If you blow air through a straw onto your hand, that’s how it feels," Subramanian says.
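Making sound waves "converge at the same points at the same time" is the standard phased-array trick: each emitter is driven with a phase offset that compensates for its distance to the desired focal point, so all the waves arrive there in step and their pressure adds up. The sketch below is a textbook illustration of that principle, not the UltraHaptics implementation; the 40 kHz frequency is a common choice for airborne ultrasound emitters, and the function names are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQ = 40_000.0         # 40 kHz, a common airborne-ultrasound frequency

def phase_delays(emitter_positions, focal_point):
    """Per-emitter phase (radians) that makes every wave arrive at
    `focal_point` in phase, so their pressure adds constructively.
    Textbook phased-array sketch, not the UltraHaptics code."""
    positions = np.asarray(emitter_positions, dtype=float)
    # Straight-line distance from each emitter to the focus.
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    wavelength = SPEED_OF_SOUND / FREQ  # ~8.6 mm at 40 kHz
    # Advance each emitter's phase by its path length in wavelengths,
    # wrapped to [0, 2*pi), so all waves reach the focus in step.
    return (2 * np.pi * distances / wavelength) % (2 * np.pi)
```

Steering the focus to a fingertip is then just a matter of recomputing these delays for a new focal point, which is what lets the system target individual fingers.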
UltraHaptics can target individual fingers, so people can, say, zoom into an image by pinching two fingers together, or increase the volume of music by moving a hand from left to right. It can also project a layer of data over a display — for instance, conveying how population density varies across a map through different levels of pressure felt above it.
UltraHaptics could find applications “anywhere mid-air gestures are popular now,” Subramanian says. “People talk about Kinect and Leap for mid-air gestures, and this can give an extra dimension of input—touch—for gesture control, laptops, tablet devices, automotive dashboards, gaming.”
When compared with the touchscreen research from Disney, “UltraHaptics can allow you to feel tactile feedback before you touch a screen, but UltraHaptics would find it hard to give tactile feedback when you are touching a screen, which is what Disney can do,” Subramanian says. “I think we complement each other quite well.”
Top Image: Depth maps extracted from Kinect-like sensors are used to render fine features on visual images that are not touchable or reachable. Photo courtesy Disney Research.