Title: AudioHaptics: audio and haptic rendering based on a physical model
Abstract: In this paper, we propose AudioHaptics, a method for synthesizing haptic and auditory sensations based on a physical model. We have developed a haptic environment that incorporates auditory sensation by fitting a loudspeaker to the end effector of a haptic interface. The finite element method (FEM) is used to calculate the vibration of a virtual object when an impact occurs, and the sound pressure at the speaker position is then computed in real time from the 2D complex amplitude of the object surface. The AudioHaptics system can generate sounds originating from virtual objects of arbitrary shape, material attributes, and inner structure. Evaluation experiments with real users demonstrated that this method is effective for rendering audio and haptic sensations.
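As a rough illustration of the rendering idea summarized in the abstract (not the authors' implementation, which derives sound pressure at the speaker position from FEM-computed surface amplitudes), the Python/NumPy sketch below synthesizes an impact sound as a sum of exponentially decaying sinusoids. The function name and all numeric modal values are hypothetical; in an AudioHaptics-style pipeline they would come from an offline FEM analysis of the virtual object.

```python
import numpy as np

def synthesize_impact_sound(freqs_hz, dampings, amplitudes,
                            duration_s=1.0, sample_rate=44100):
    """Render an impact sound as a sum of decaying sinusoids.

    freqs_hz   : modal frequencies of the struck virtual object (Hz)
    dampings   : per-mode exponential decay rates (1/s)
    amplitudes : per-mode excitation amplitudes at the impact point
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    sound = np.zeros_like(t)
    for f, d, a in zip(freqs_hz, dampings, amplitudes):
        sound += a * np.exp(-d * t) * np.sin(2.0 * np.pi * f * t)
    peak = np.max(np.abs(sound))
    return sound / peak if peak > 0.0 else sound

# Hypothetical modes of a small struck plate (illustrative values only).
wave = synthesize_impact_sound(
    freqs_hz=[440.0, 1120.0, 2350.0],
    dampings=[6.0, 9.0, 14.0],
    amplitudes=[1.0, 0.5, 0.25],
)
```

The resulting waveform could be streamed to the loudspeaker on the end effector at the moment the haptic interface reports a collision, so that the auditory and haptic cues share the same contact event.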
Year: 2004
DOI: 10.1109/HAPTIC.2004.1287203
Venue: HAPTICS
Keywords: finite element method, object surface, physical model, auditory sense, haptic rendering, haptic interface, auditory sensation, haptic sensation, virtual object, real time, audiohaptics system, haptic environment, virtual reality, virtual environment, end effectors, loudspeakers, finite element analysis, reverberation, feedback, finite element methods, shape
Field: Virtual image, Computer vision, Reverberation, Virtual reality, Virtual machine, Computer science, Artificial intelligence, Loudspeaker, Rendering (computer graphics), Haptic technology, Stereotaxy
DocType: Conference
ISBN: 0-7695-2112-6
Citations: 3
PageRank: 0.61
References: 5
Authors: 5
Name              Order  Citations  PageRank
Hiroaki Yano      1      418        59.03
Hiromi Igawa      2      3          0.95
Toshihiro Kameda  3      3          1.62
Koichi Mizutani   4      3          0.61
Hiroo Iwata       5      634        150.15