Title
A framework to manage multimodal fusion of events for advanced interactions within virtual environments
Abstract
This paper describes the EVI3d framework, a distributed architecture developed to enhance interactions within Virtual Environments (VE). The framework manages many multi-sensorial devices, such as trackers, data gloves, speech and gesture recognition systems, and haptic devices. Its structure allows device services and their clients to be dispatched across as many machines as required. Because its time-synchronization system provides dated events, a dedicated module can be designed to manage multimodal fusion processes. To this end, we describe how the EVI3d framework manages not only low-level events but also abstract modalities. Moreover, the data flow service of the EVI3d framework solves the problem of sharing the virtual scene between modality modules.
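The dated-event fusion mentioned in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the EVI3d API: it assumes a simple DatedEvent structure carrying a timestamp from a synchronized clock, and shows one way a fusion module could pair a speech event (e.g. the deictic word "that") with the temporally closest pointing gesture.

    #include <cstdint>
    #include <iostream>
    #include <optional>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical illustration only -- not the actual EVI3d API.
    // A low-level event stamped with a date from the synchronized clock (in ms).
    struct DatedEvent {
        std::int64_t date_ms;   // timestamp provided by the time-synchronization system
        std::string  modality;  // e.g. "speech" or "gesture"
        std::string  payload;   // e.g. a recognized word or a pointed object id
    };

    // Fuse a speech event with the gesture event whose date is closest,
    // within a tolerance window: one simple policy for merging dated modalities.
    std::optional<std::pair<DatedEvent, DatedEvent>>
    fuseSpeechWithGesture(const DatedEvent& speech,
                          const std::vector<DatedEvent>& gestures,
                          std::int64_t window_ms = 500) {
        std::optional<DatedEvent> best;
        std::int64_t best_delta = window_ms + 1;
        for (const DatedEvent& g : gestures) {
            const std::int64_t delta = g.date_ms > speech.date_ms
                                           ? g.date_ms - speech.date_ms
                                           : speech.date_ms - g.date_ms;
            if (delta <= window_ms && delta < best_delta) {
                best = g;
                best_delta = delta;
            }
        }
        if (!best) return std::nullopt;
        return std::make_pair(speech, *best);
    }

    int main() {
        // "Put that there": the deictic word and the pointing gesture are fused
        // because their synchronized dates nearly coincide.
        DatedEvent speech{10250, "speech", "that"};
        std::vector<DatedEvent> gestures{
            {9100, "gesture", "object_A"},
            {10230, "gesture", "object_B"},
        };
        if (auto fused = fuseSpeechWithGesture(speech, gestures)) {
            std::cout << '"' << fused->first.payload << "\" designates "
                      << fused->second.payload << '\n';  // prints: "that" designates object_B
        }
        return 0;
    }

The real framework naturally uses a richer event representation and fusion policy; the sketch only shows why globally dated events make such pairing straightforward.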
Year
2002
Venue
EGVE
Field
Modalities, BitTorrent tracker, Computer vision, Architecture, Computer science, Time synchronization, Gesture recognition, Human–computer interaction, Artificial intelligence, Haptic technology, Data flow diagram
DocType
Conference
ISBN
1-58113-535-1
Citations
17
PageRank
2.45
References
14
Authors
4
Name               Order   Citations   PageRank
Damien Touraine    1       52          5.68
Patrick Bourdot    2       143         21.26
Yacine Bellik      3       132         20.30
Laurence Bolot     4       24          3.37