Multimedia Center

Predicting Successful Tactile Mapping of Virtual Objects

by Luca Brayda, Claudio Campus, and Monica Gori

Improving the spatial ability of blind and visually impaired people is the main target of orientation and mobility (O&M) programs. In this study we use a minimalistic mouse-shaped haptic device to demonstrate a new approach to evaluating devices that provide tactile representations of virtual objects. We consider psychophysical, behavioural, and subjective parameters to clarify under which circumstances mental representations of space (cognitive maps) can be efficiently constructed through touch by blindfolded sighted subjects. We study two complementary processes that determine map construction: low-level perception (in a passive stimulation task) and high-level information integration (in an active exploration task). We show that jointly considering a behavioural measure of information acquisition and a subjective measure of cognitive load yields an accurate prediction and a practical interpretation of mapping performance. Our simple TActile MOuse (TAMO) uses haptics to assess spatial ability: this may help individuals who are blind or visually impaired to be evaluated more effectively by O&M practitioners, or to evaluate their own performance.
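As a toy illustration of the kind of two-predictor model the abstract describes, the sketch below fits mapping accuracy against an information-acquisition score and a cognitive-load rating. All numbers are invented for illustration; this is not the authors' analysis or data.

```python
import numpy as np

# Hypothetical per-subject scores (NOT from the paper): a behavioural
# information-acquisition score, a self-reported cognitive load (1-10),
# and the resulting mapping accuracy (0-1).
acquisition = np.array([0.8, 0.6, 0.9, 0.4, 0.7, 0.5])
load = np.array([3.0, 6.0, 2.0, 8.0, 4.0, 7.0])
accuracy = 0.1 + 0.9 * acquisition - 0.02 * load  # constructed example

# Least-squares fit: accuracy ~ b0 + b1*acquisition + b2*load.
X = np.column_stack([np.ones_like(acquisition), acquisition, load])
b, *_ = np.linalg.lstsq(X, accuracy, rcond=None)
predicted = X @ b
```

A positive weight on acquisition and a negative weight on load would match the intuition that performance rises with information gathered and falls with mental effort.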

The full article can be found here:

A Proxy Method for Real-Time 3-DOF Haptic Rendering of Streaming Point Cloud Data

by Fredrik Ryden and Howard Jay Chizeck

This paper presents a new haptic rendering method for streaming point cloud data. It provides haptic rendering of moving physical objects using data obtained from RGB-D cameras. Thus, real-time haptic interaction with moving objects can be achieved using noncontact sensors. This method extends "virtual coupling"-based proxy methods in a way that does not require preprocessing of points and allows for spatial point cloud discontinuities. The key ideas of the algorithm are iterative motion of the proxy with respect to the points, and the use of a variable proxy step size that results in better accuracy for short proxy movements and faster convergence for longer movements. This method provides highly accurate haptic interaction for geometries in which the proxy can physically fit. Another advantage is a significant reduction in the risk of "pop through" during haptic interaction with dynamic point clouds, even in the presence of noise. This haptic rendering method is computationally efficient; it can run in real time on available personal computers without the need for downsampling of point clouds from commercially available depth cameras.
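The iterative proxy update with a variable step size can be sketched as follows. This is a minimal single-frame illustration, not the authors' implementation; the parameter values and the spherical contact test are assumptions.

```python
import numpy as np

def proxy_step(device_pos, proxy_pos, cloud, contact_radius=0.01,
               k=500.0, min_step=0.001, max_step=0.02, iters=20):
    """One haptic frame: move the proxy toward the device position in
    variable-size steps, stopping when a step would penetrate the
    point cloud. Returns the new proxy position and feedback force."""
    proxy = proxy_pos.astype(float).copy()
    for _ in range(iters):
        to_device = device_pos - proxy
        dist = np.linalg.norm(to_device)
        if dist < 1e-9:
            break                       # proxy has reached the device
        # long steps when far from the device, short ones near contact
        step = min(max(dist, min_step), max_step)
        candidate = proxy + to_device / dist * min(step, dist)
        # reject the step if it lands inside any point's contact shell
        if np.min(np.linalg.norm(cloud - candidate, axis=1)) < contact_radius:
            break                       # blocked by the surface
        proxy = candidate
    # spring ("virtual coupling") force pulls the device toward the proxy
    return proxy, k * (proxy - device_pos)
```

In free space the proxy converges to the device and the force vanishes; against the cloud the proxy stops on the surface and the spring produces a restoring force.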

The full article can be found here:

Human Force Discrimination during Active Arm Motion for Force Feedback Design

by Seyedshams Feyzabadi, Sirko Straube, Michele Folgheraiter, Elsa Andrea Kirchner, Su Kyoung Kim, and Jan Christian Albiez

The goal of this study was to analyze the human ability to discriminate external forces while actively moving the arm. With the approach presented here, we give an overview, for the whole arm, of the just-noticeable differences (JNDs) for controlled movements executed separately at the wrist, elbow, and shoulder joints. The work was originally motivated by the design phase of the actuation system of a wearable exoskeleton, used in a teleoperation scenario where force feedback should be provided to the subject. The amount of this force feedback has to be calibrated according to human force discrimination abilities. In the experiments presented here, 10 subjects performed a series of movements against an opposing force from a commercial haptic interface. Force changes had to be detected in a two-alternative forced-choice task. For each of the three joints tested, perceptual thresholds were measured as an absolute threshold (no reference force) and three JNDs corresponding to three chosen reference forces, using the outcome of the QUEST procedure after 70 trials. From these four measurements we computed the Weber fraction. Our results demonstrate that the Weber fraction differs by joint: 0.11, 0.13, and 0.08 for the wrist, elbow, and shoulder, respectively. We discuss how force perception may be affected by the number of muscles involved and by the reproducibility of the movement itself. The minimum perceivable force, on average, was 0.04 N for all three joints.
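One standard way to obtain a Weber fraction from an absolute threshold and several JNDs is a linear fit of JND against reference force, with the slope giving the Weber fraction. The sketch below illustrates that idea with hypothetical numbers matching the wrist values reported in the abstract; the paper's exact procedure may differ.

```python
import numpy as np

def weber_fraction(ref_forces, jnds):
    """Least-squares fit JND = w * F_ref + c. The slope w is the
    Weber fraction; the intercept c approximates the absolute
    threshold (the smallest perceivable force)."""
    w, c = np.polyfit(ref_forces, jnds, 1)
    return w, c

# Hypothetical wrist-like data: an absolute threshold of ~0.04 N and
# JNDs at three reference forces following a 0.11 Weber fraction.
refs = np.array([0.0, 1.0, 2.0, 4.0])
jnds = 0.11 * refs + 0.04
w, c = weber_fraction(refs, jnds)
```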

The full article can be found here:

Wrist Coordination in a Kinematically Redundant Stabilization Task

by Lorenzo Masia, Valentina Squeri, Etienne Burdet, Giulio Sandini, and Pietro Morasso

We investigated how control of a compliant object exploits the redundancy of wrist anatomy. Subjects had to balance a one degree-of-freedom (DoF) inverted pendulum using elastic linkages controlled by wrist flexion/extension (FE) and forearm pronation/supination (PS). Haptic feedback of the interaction forces between the pendulum and the wrist was provided by a robotic interface. By tuning the mechanical properties of the virtual pendulum and the stiffness of the elastic linkages, it was possible to study various dynamical regimes of the simulated object. Twenty subjects (divided into two groups) were tested over four days, performing the same task but with different presentation orders. When the pendulum was linked to stiff springs and had a relatively fast dynamic response, subjects primarily used the PS DoF for stabilization; in contrast, with lower spring stiffness and slower dynamics of the virtual object, the stabilization task was shared by both DoFs. This video shows the haptic device in use.
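The task's dynamics can be sketched as a 1-DoF inverted pendulum coupled to a commanded hand angle through a torsional spring of stiffness k. This is a hypothetical minimal model with invented parameters, not the authors' simulation; it only illustrates how a stiffer spring gives the hand more authority over a faster plant.

```python
import math

def simulate(k, control, theta0=0.05, dt=0.001, T=2.0,
             m=1.0, l=0.3, g=9.81):
    """Euler integration of I*theta'' = m*g*l*sin(theta) - k*(theta - u):
    a 1-DoF inverted pendulum balanced through an elastic linkage whose
    far end is held at the commanded angle u = control(theta, omega)."""
    I = m * l * l
    theta, omega = theta0, 0.0
    for _ in range(int(T / dt)):
        u = control(theta, omega)
        alpha = (m * g * l * math.sin(theta) - k * (theta - u)) / I
        omega += alpha * dt
        theta += omega * dt
    return theta

# Stiff spring plus a simple hand strategy that leans the linkage
# against the fall: the pendulum settles back toward upright.
final = simulate(k=20.0, control=lambda th, om: -2.0 * th - 0.3 * om)
```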

The full article can be found here:

Evaluation of Tactile Feedback Methods for Wrist Rotation Guidance

by Andrew A. Stanley and Katherine J. Kuchenbecker

Tactile motion guidance systems aim to direct the user's movement toward a target pose or trajectory by delivering tactile cues through lightweight wearable actuators. This study evaluates 10 forms of tactile feedback for guidance of wrist rotation to understand the traits that influence the effectiveness of such systems. We present five wearable actuators capable of tapping, dragging across, squeezing, twisting, or vibrating against the user's wrist; each actuator can be controlled via steady or pulsing drive algorithms. Ten subjects used each form of feedback to perform three unsighted movement tasks: directional response, position targeting, and trajectory following. The results show that directional responses are fastest when direction is conveyed through the location of the tactile stimulus or steady lateral skin stretch. Feedback that clearly conveys movement direction enables subjects to reach target positions most quickly, though tactile magnitude cues (steady intensity and especially pulsing frequency) can also be used when direction is difficult to discern. Subjects closely tracked arbitrary trajectories only when both movement direction and cue magnitude were subjectively rated as very easy to discern. The best overall performance was achieved by the actuator that repeatedly taps on the subject's wrist on the side toward which they should turn. As the paper notes, "Fig. 2 shows isometric drawings of the four custom actuators, and this paper's supplemental video shows all of the actuators in use."
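A guidance scheme of this kind can be sketched as a mapping from wrist-rotation error to a cue that encodes direction in the tap location and magnitude in the pulse rate. This is a hypothetical illustration of the concept; the thresholds and rates below are invented, not the paper's controller.

```python
def guidance_cue(error_deg, deadband=2.0, min_hz=1.0, max_hz=10.0,
                 saturation=45.0):
    """Map the signed wrist-rotation error (target - current, degrees)
    to a tactile cue: tap on the side toward which the user should
    turn, pulsing faster as the error grows. Returns None inside the
    deadband, else a (side, pulse_rate_hz) pair."""
    if abs(error_deg) < deadband:
        return None                     # close enough: no cue
    side = "right" if error_deg > 0 else "left"
    magnitude = min(abs(error_deg), saturation) / saturation
    rate_hz = min_hz + (max_hz - min_hz) * magnitude
    return side, rate_hz
```

For example, a small error produces no cue, while a large rightward error yields fast tapping on the right side of the wrist.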

The full article can be found here:

Easy Links