Title
From expressive gesture to sound
Abstract
This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness in the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model built on the Max/MSP platform that (1) captures real-time movement of the human body in a 3D coordinate system on the basis of the orientation output of any OSC-compatible inertial sensor system, (2) extracts low-level movement features that quantify contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented, intuitive mapping strategy was of central importance; it was achieved by conducting an empirical experiment grounded in theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between a musician and their instrument, artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
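
The pipeline the abstract outlines (OSC orientation input → 3D positions → contraction/expansion feature → harmonic-voice mapping) can be illustrated with a minimal sketch. This is not the authors' Max/MSP patch: it assumes the python-osc package, a hypothetical OSC address "/joints" carrying flattened x, y, z joint coordinates, and an invented linear mapping from the contraction/expansion index to a number of added harmonic voices.

```python
# Minimal sketch of the described pipeline, not the authors' Max/MSP
# implementation. Assumes the python-osc package; the OSC address
# "/joints", the ARM_SPAN normalization scale, and the linear
# voice-count mapping are hypothetical illustrations.
import math

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

MAX_VOICES = 4   # assumed upper bound on added harmonic voices
ARM_SPAN = 0.9   # assumed normalization scale in metres


def contraction_index(points):
    """Mean distance of the tracked points from their centroid.

    Small values indicate a contracted posture, large values an
    expanded one: a simple proxy for how much of the surrounding
    space the subject uses.
    """
    n = len(points)
    centroid = tuple(sum(p[axis] for p in points) / n for axis in range(3))
    return sum(math.dist(p, centroid) for p in points) / n


def on_joints(address, *coords):
    if not coords or len(coords) % 3:
        return  # malformed message; a real patch would report this
    # Expect a flat argument list: x1, y1, z1, x2, y2, z2, ...
    points = [tuple(coords[i:i + 3]) for i in range(0, len(coords), 3)]
    index = contraction_index(points)
    # Hypothetical mapping: the more expanded the posture, the more
    # harmonically related voices are added to the monophonic voice.
    voices = max(0, min(MAX_VOICES, int(index / ARM_SPAN * MAX_VOICES)))
    print(f"contraction/expansion {index:.3f} m -> add {voices} voice(s)")


dispatcher = Dispatcher()
dispatcher.map("/joints", on_joints)
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```

In the published system the feature extraction and mapping stages run inside Max/MSP; the sketch only shows the shape of those stages under the stated assumptions.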
Year: 2010
DOI: 10.1007/s12193-009-0027-3
Venue: J. Multimodal User Interfaces
Keywords: multimodal interface · mapping · inertial sensing technique · usability testing
Field: Embodied music cognition, Gesture, Computer science, Musical, Usability, Speech recognition, Human–computer interaction, Performing arts, Trajectory, Human body, Cartesian coordinate system
DocType: Journal
Volume: 3
Issue: 1
ISSN: 1783-8738
Citations: 2
PageRank: 0.50
References: 8
Authors (5)

Name                Order  Citations  PageRank
Pieter-Jan Maes     1      18         3.76
Marc Leman          2      59         8.26
Micheline Lesaffre  3      130        16.49
Michiel Demey       4      26         4.47
Dirk Moelants       5      79         10.82