Tactile Rendering of 3D Features on Touch Surfaces

Disney Research, Pittsburgh

Summary

In this project, we develop and apply a tactile rendering algorithm to simulate rich 3D geometric features (such as bumps, ridges, edges, protrusions, textures, etc.) on touch screen surfaces. The underlying hypothesis is that when a finger slides over an object, minute surface variations are sensed by friction-sensitive mechanoreceptors in the skin. Modulating the friction forces between the fingertip and the touch surface should therefore create the illusion of surface variations. We propose that the perception of a 3D “bump” is created when local gradients of the virtual bump are mapped to lateral friction forces.

To validate our approach, we used an electrovibration-based friction display to modulate the friction forces between the touch surface and the sliding finger. We first determined a psychophysical relationship between the voltage applied to the display and the subjective strength of the friction forces, and then used this function to render friction forces directly proportional to the gradient (slope) of the surface being rendered. In a pairwise comparison study, we showed that users are at least three times more likely to prefer the proposed slope model over other commonly used models. Our algorithm is concise, lightweight, and easily applied to static images and video streams.

Research Paper

Kim, S.-C., Israr, A., and Poupyrev, I. Tactile Rendering of 3D Features on Touch Surfaces. In Proceedings of UIST '13. ACM.
Paper [PDF, 9.8MB]

Technical Details

Our algorithm has three main steps: 1) calculate the gradient of the virtual surface to be rendered, 2) take the dot product of that gradient with the velocity of the sliding finger, and 3) map the dot product to a voltage using the psychophysical relationship, as sketched below.
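
The following is a minimal Python sketch of these three steps. Everything here is illustrative: BETA and V_MAX are placeholder constants, and a Stevens-style power law stands in for the psychophysical voltage-to-friction function measured in the paper.

    import numpy as np

    # Placeholder psychophysical model (an assumption, not the measured
    # function): perceived friction ~ (V / V_MAX) ** BETA.
    BETA = 1.5
    V_MAX = 100.0  # peak electrovibration drive voltage (placeholder)

    def voltage_for_friction(target):
        """Invert the assumed power law: desired friction in [0, 1] -> voltage."""
        target = np.clip(target, 0.0, 1.0)
        return V_MAX * target ** (1.0 / BETA)

    def render_step(gradient, velocity):
        """One rendering update: gradient (gx, gy) of the virtual surface at
        the finger position, velocity (vx, vy) of the sliding finger."""
        speed = np.hypot(velocity[0], velocity[1])
        if speed == 0.0:
            return 0.0  # no sliding, no friction modulation
        # Step 2: dot product of the gradient with the direction of motion,
        # i.e., the surface slope along the finger's path.
        slope = (gradient[0] * velocity[0] + gradient[1] * velocity[1]) / speed
        # Clip downhill motion to zero: electrovibration can only add friction
        # (one possible design choice; slopes are assumed pre-scaled to [0, 1]).
        # Step 3: map the result to a drive voltage.
        return voltage_for_friction(max(slope, 0.0))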

In order to render real objects, such as the one shown in (a), we extend our basic algorithm. The input to the algorithm is a depth map of the object, either measured with a sensor such as a Kinect or extracted from a 3D model, see (b). From the depth field we calculate a gradient field (c) and render haptic feedback as the finger moves over the 2D image of the object (d).
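
As a sketch, steps (b) and (c) might be precomputed as follows, assuming the depth map arrives as a 2D NumPy array normalized to [0, 1] (np.gradient is used here as a generic finite-difference operator, not necessarily the one used in the paper):

    import numpy as np

    def gradient_field(depth):
        """Precompute the gradient field (c) from a depth map (b)."""
        gy, gx = np.gradient(depth)  # finite differences along rows, columns
        return gx, gy

    def sample_gradient(gx, gy, x, y):
        """Look up the local gradient at the finger position (nearest pixel)."""
        i = int(np.clip(round(y), 0, gx.shape[0] - 1))
        j = int(np.clip(round(x), 0, gx.shape[1] - 1))
        return gx[i, j], gy[i, j]

At run time, each touch-move event supplies the finger position and velocity; the sampled gradient then feeds the rendering step sketched above (d).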


Team and Credits

The rendering algorithm is being developed at Disney Research, Pittsburgh by Seung-Chan Kim and Ali Israr. The project video was created and compiled by Kaitlyn Schwalje.

Contact

Email: israr [at] disneyresearch [dot] com

Gallery

A 1D Gaussian bump: the basic unit for tactile content generation. The height H and width W are variable and defined by designers.
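
In code, such a bump and the slope that drives the friction signal might look like this (the standard Gaussian is assumed; H, W, and the center x0 are designer-chosen parameters):

    import numpy as np

    def gaussian_bump(x, H=1.0, W=0.2, x0=0.5):
        """1D Gaussian bump of height H and width W, centered at x0."""
        return H * np.exp(-((x - x0) ** 2) / (2.0 * W ** 2))

    def gaussian_bump_slope(x, H=1.0, W=0.2, x0=0.5):
        """Analytic gradient of the bump; this is what maps to friction."""
        return -(x - x0) / W ** 2 * gaussian_bump(x, H, W, x0)

Summing shifted copies of gaussian_bump yields arrays of bumps, as in the next image.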

An array of 1D Gaussian bumps is rendered to simulate stacks of DVDs, books, and other objects.

The picture in (a) is augmented with user-defined 2D ellipsoidal bumps, shown in (b). Fine details of the picture are rendered by analyzing the gray-scale values of the image.
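
One plausible reading of this step, sketched below, is to treat normalized pixel intensity as a height map and reuse the gradient pipeline above (the filename and the intensity-to-height mapping are illustrative assumptions):

    import numpy as np
    from PIL import Image

    # Assume brightness encodes surface height in [0, 1].
    gray = np.asarray(Image.open("picture.png").convert("L"), dtype=float)
    height = gray / 255.0
    gy, gx = np.gradient(height)  # fine-detail gradient field for rendering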

Depth maps extracted from Kinect-like sensors are used to render fine features on visual images of objects that are otherwise not touchable or reachable.

Data extracted from digital elevation models is overlaid on navigation maps to provide elevation and depth information to users.

One main feature of our algorithm is that it is lightweight and can easily be implemented in real time. (a) A 3D model of an object can be zoomed and panned in real time while the user senses its fine edges and protruding features. (b) Similarly, the algorithm scales to render fine tactile features on a live video stream.

Copyright Notice

The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.