Title
Specification of Multimodal Interactions in NCL
Abstract
This paper proposes an approach to integrate multimodal events, both user-generated (e.g., audio recognizers, motion sensors) and user-consumed (e.g., speech synthesizers, haptic synthesizers), into programming languages for the declarative specification of multimedia applications. More precisely, it presents extensions to the NCL (Nested Context Language) multimedia language. NCL is the standard declarative language for the development of interactive applications for Brazilian Digital TV and an ITU-T Recommendation for IPTV services. NCL applications extended with the proposed multimodal features are presented as results. Historically, the Human-Computer Interaction research community has focused on user-generated modalities, through studies on user interaction. The Multimedia community, on the other hand, has focused on output modalities, through studies on timing and multimedia processing. The proposals in this paper are an attempt to integrate concepts from both research communities into a single high-level programming framework, which aims to assist the authoring of multimedia/multimodal applications.
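To make the abstract's idea concrete, the following is a minimal, hypothetical NCL 3.0 sketch of the kind of extension the paper describes: a speech-recognition media object (an SRGS grammar as a user-generated input modality) whose recognition event starts a speech-synthesis media object (an SSML document as a user-consumed output modality). The document structure follows standard NCL; the recognizer/synthesizer MIME types and the "onRecognition" role are illustrative assumptions, not necessarily the paper's exact proposed syntax.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: structure follows standard NCL 3.0; the
     recognition media types and the "onRecognition" role are assumed
     for illustration, not confirmed from the paper. -->
<ncl id="multimodalExample" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
  <head>
    <connectorBase>
      <!-- When the recognizer matches the grammar, start the synthesizer. -->
      <causalConnector id="onRecognitionStart">
        <simpleCondition role="onRecognition"/> <!-- assumed new event role -->
        <simpleAction role="start"/>
      </causalConnector>
    </connectorBase>
  </head>
  <body>
    <port id="entry" component="video"/>
    <port id="listen" component="speechRec"/>

    <!-- Conventional audiovisual media object. -->
    <media id="video" src="media/main.mp4"/>

    <!-- User-generated modality: speech recognition driven by an SRGS grammar. -->
    <media id="speechRec" src="media/commands.srgs" type="application/srgs+xml"/>

    <!-- User-consumed modality: speech synthesis described in SSML. -->
    <media id="speechSynth" src="media/answer.ssml" type="application/ssml+xml"/>

    <!-- Recognizing an utterance triggers the synthesized reply. -->
    <link xconnector="onRecognitionStart">
      <bind role="onRecognition" component="speechRec"/>
      <bind role="start" component="speechSynth"/>
    </link>
  </body>
</ncl>

The design point this sketch illustrates is the one the abstract argues for: input modalities (recognition) and output modalities (synthesis) are both expressed as ordinary NCL media objects and wired together with the language's existing connector/link machinery, so multimodal behavior stays declarative.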
Year
2015
DOI
10.1145/2820426.2820436
Venue
WebMedia
Field
Modalities, Multimodal interaction, Speech synthesis, Computer science, Digital television, IPTV, Declarative programming, Multimedia, Haptic technology, Software framework
DocType
Conference
Citations
1
PageRank
0.38
References
12
Authors
4