Feel the 'Fabric': An Audio-Haptic Interface
dc.contributor.author | Huang, G. | en_US |
dc.contributor.author | Metaxas, D. | en_US |
dc.contributor.author | Govindaraj, M. | en_US |
dc.contributor.editor | D. Breen and M. Lin | en_US |
dc.date.accessioned | 2014-01-29T06:32:17Z | |
dc.date.available | 2014-01-29T06:32:17Z | |
dc.date.issued | 2003 | en_US |
dc.description.abstract | An objective fabric modeling system should convey not only visual but also haptic and audio sensory feedback to remote/Internet users via an audio-haptic interface. In this paper we develop a fabric surface property modeling system consisting of stylus-based modeling of a fabric's characteristic sound and an audio-haptic interface. Using a stylus, people can perceive a fabric's surface roughness, friction, and softness, though not as precisely as with their bare fingers. The audio-haptic interface is intended to simulate the case of "feeling a virtually fixed fabric via a rigid stylus" using the PHANToM haptic device. We develop a DFFT-based correlation-restoration method to model the surface roughness and friction coefficient of a fabric, and a physically based method to model the sound of a fabric rubbed by a stylus. The audio-haptic interface, which renders synchronized auditory and haptic stimuli as the virtual stylus rubs the surface of a virtual fabric, is implemented in VC++ 6.0 using OpenGL and the PHANToM GHOST SDK. Subjects who tested the interface were able to differentiate the surface properties of the virtual fabrics in the correct order. We show that the virtual fabric is a good model of its real counterpart. | en_US |
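The abstract describes the DFFT-based correlation-restoration roughness modeling only at a high level. As a loose illustration of the general idea of spectrum-matched roughness synthesis, and not the authors' actual method, the following C++ sketch keeps the DFT magnitudes of a hypothetical measured height profile and randomizes the phases to generate a synthetic profile with similar roughness statistics; the profile data and all names are assumptions.

    // Illustrative sketch only (not the paper's code): synthesize a 1-D surface
    // height profile whose power spectrum matches a measured scan line, by
    // keeping the DFT magnitudes and randomizing the phases.  Profile data and
    // names here are hypothetical.
    #include <cmath>
    #include <complex>
    #include <cstdio>
    #include <random>
    #include <vector>

    using cd = std::complex<double>;
    const double kPi = 3.14159265358979323846;

    // Naive discrete Fourier transform (O(N^2)); adequate for short profiles.
    static std::vector<cd> dft(const std::vector<cd>& x, bool inverse) {
        const std::size_t n = x.size();
        const double sign = inverse ? 1.0 : -1.0;
        std::vector<cd> out(n);
        for (std::size_t k = 0; k < n; ++k) {
            cd sum(0.0, 0.0);
            for (std::size_t t = 0; t < n; ++t) {
                const double ang = sign * 2.0 * kPi * k * t / n;
                sum += x[t] * cd(std::cos(ang), std::sin(ang));
            }
            out[k] = inverse ? sum / static_cast<double>(n) : sum;
        }
        return out;
    }

    int main() {
        // Hypothetical measured height profile (micrometres) along one scan line.
        std::vector<cd> measured = {0.0, 1.2, 0.8, -0.5, -1.1, 0.3, 0.9, -0.2};

        // Forward transform of the measurement.
        std::vector<cd> spectrum = dft(measured, /*inverse=*/false);

        // Keep each magnitude, draw a fresh random phase, and mirror the phases
        // so the inverse transform stays real-valued.  The synthetic profile
        // then has the same power spectrum (similar roughness statistics) as
        // the measured one while its detailed shape differs.
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> phase(0.0, 2.0 * kPi);
        const std::size_t n = spectrum.size();
        for (std::size_t k = 1; k < n / 2; ++k) {
            const double p = phase(rng);
            const double mag = std::abs(spectrum[k]);
            spectrum[k]     = mag * cd(std::cos(p),  std::sin(p));
            spectrum[n - k] = mag * cd(std::cos(p), -std::sin(p));  // conjugate bin
        }

        // Inverse transform gives the synthetic profile (imaginary parts ~ 0).
        const std::vector<cd> synthetic = dft(spectrum, /*inverse=*/true);
        for (const cd& h : synthetic)
            std::printf("%6.3f\n", h.real());
        return 0;
    }

In a haptic rendering loop, a profile like this could be sampled at the stylus contact point to modulate the normal force and the friction/sound excitation; that coupling, and the paper's actual correlation-restoration step, are not shown here.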
dc.description.seriesinformation | Symposium on Computer Animation | en_US |
dc.identifier.isbn | 1-58113-659-5 | en_US |
dc.identifier.issn | 1727-5288 | en_US |
dc.identifier.uri | https://doi.org/10.2312/SCA03/052-061 | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.title | Feel the 'Fabric': An Audio-Haptic Interface | en_US |