

New sensation: Disney invents touchscreen that lets users 'feel' images


Scientists at Disney have taken the touchscreen experience to the next level by creating a textured screen that allows viewers to feel videos and images. 

An algorithm developed by Disney Research can be used on touch devices such as desktops, mobile phones and tablets to simulate on-screen 3D features, including ridges, edges, protrusions and texture.

The virtual bumps are mapped in a way that controls the friction a user feels as their finger slides across the otherwise smooth flat-screen surface.

Examples of how the technology has been harnessed include users interacting with fossilised bones, a bunch of apples, a map of a mountain, a video stream of a swimming jellyfish, and the contours of a kettle.

Vibrations from devices fool the body by mimicking the stretch and push on skin that would happen when a person encounters an object in real life.

Ivan Poupyrev, the director of Disney Research Pittsburgh's Interaction Group, said: “Our brain perceives the 3D bump on a surface mostly from information that it receives via skin stretching. Touch interaction has become the standard for smartphones, tablets and even desktop computers, so designing algorithms that can convert the visual content into believable tactile sensations has immense potential for enriching the user experience”.

Scientists calculated the relationship between the voltage applied to the display and the amount of pressure different touchscreen users apply, and used this to make the resistance a finger feels correspond to the slope of on-screen objects. The study shows that viewers are at least three times more likely to prefer a textured screen to models already on the market.
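The idea of tying friction to slope can be sketched in a few lines of code. This is a minimal illustration, not Disney's algorithm: the function name, voltage range and scaling constant are all assumptions, and it simply maps the steepness of a virtual "height map" to a per-pixel drive voltage, so the sides of a bump feel rougher than its flat top.

```python
import numpy as np

def friction_voltage(height_map, v_max=120.0, k=1.0):
    """Hypothetical sketch: map the local slope of an on-screen height map
    to a drive voltage, so steeper visual features feel like more friction.
    v_max and k are illustrative constants, not published values."""
    # Gradient of the virtual surface along the y and x axes.
    gy, gx = np.gradient(height_map.astype(float))
    slope = np.hypot(gx, gy)  # local steepness at each pixel
    # Normalise to [0, 1], then scale to the assumed voltage range.
    peak = slope.max()
    norm = slope / peak if peak > 0 else slope
    return np.clip(k * norm * v_max, 0.0, v_max)

# A virtual bump: friction peaks on its sloping sides, not at its flat crest.
x = np.linspace(-1, 1, 64)
bump = np.exp(-(x[None, :] ** 2 + x[:, None] ** 2) / 0.1)
v = friction_voltage(bump)
```

Here the flat centre of the bump produces almost no friction signal, while the steep ring around it drives the voltage towards its maximum, which matches the article's description of resistance tracking the sloping of on-screen objects.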

Ali Israr, the project’s lead researcher and a Disney Research engineer, described how this approach moves away from traditional models where devices use “a library of canned effects” that are played back depending on how a user interacts with a device. “This makes it difficult to create a tactile feedback for dynamic visual content, where the sizes and orientation of features constantly change. With our algorithm we do not have one or two effects, but a set of controls that make it possible to tune tactile effects to a specific visual artifact on the fly.”

The technology will be presented at the ACM Symposium on User Interface Software and Technology, running from 8 to 11 October in St Andrews, Scotland.