Visual-Haptic Interface to Virtual Environment
Robotics Institute, Carnegie Mellon University
Project Head: Ralph Hollis

Haptic interfaces have potential applications in training and simulation, where kinesthetic sensation plays an important role alongside the usual visual input. The problem of combining visual and haptic feedback, however, has not been seriously considered. Some systems simply place a graphics display beside the haptic interface, resulting in a "feeling here but looking there" situation. Skills such as pick-and-place can be regarded as visual-motor skills, in which visual and kinesthetic stimuli are tightly coupled. If a simulation/training system does not provide the proper visual/haptic relationship, the training may not accurately reflect the real task (no skill transfer), or worse, may run counter to it (negative skill transfer).


In our work, we propose a new concept for visual/haptic interfaces that we call a "WYSIWYF display," where WYSIWYF means "What You See Is What You Feel." The concept combines vision-based object registration for the visual interface with an encountered-type display for the haptic interface, so that the virtual object is seen and felt at the same spatial location.
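As an illustration of the vision-based registration step, the pose aligning a virtual object with its observed counterpart can be recovered from 3-D point correspondences using the standard Kabsch (orthogonal Procrustes) method. This is a generic sketch under our own assumptions, not the project's actual implementation; the function name `register_rigid` is ours.

```python
import numpy as np

def register_rigid(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||.

    src, dst: (N, 3) arrays of corresponding 3-D points.
    Returns (R, t) with R a proper rotation (det = +1).
    """
    # Center both point sets about their centroids.
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (src - c_src).T @ (dst - c_dst)
    # SVD gives the optimal rotation; the sign correction
    # guards against a reflection when det would be -1.
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With the registered pose in hand, the virtual object can be rendered at the location where the encountered-type haptic device presents its physical counterpart, which is the essence of the WYSIWYF condition.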
