Abstract |
---|
This paper describes the implementation of a virtual xylophone. During a setup phase, the program registers a background depth image, generated by a Kinect sensor, and the user interacts with the program to identify the color of tone bars and to select a restricted track space for tracking mallet locations. During a play phase, the program tracks mallet heads by locating pixels that are in front of the pixels registered in the background image. The program can easily be modified to restrict the notes available to the player or to use pentatonic or other musical scales. |
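The tracking step described in the abstract, registering a background depth image and then flagging pixels that lie in front of it, amounts to depth-based background subtraction. The paper does not give its implementation, so the following is only a minimal NumPy sketch under stated assumptions: Kinect-style depth values in millimeters, a hypothetical `threshold` noise margin, and synthetic data in place of real sensor frames.

```python
import numpy as np

def find_foreground(depth_frame, background, threshold=30):
    """Flag pixels closer to the sensor than the registered background.

    depth_frame, background: 2-D arrays of depth in millimeters.
    threshold: hypothetical noise margin (mm); not from the paper.
    Pixels with depth 0 (no sensor reading) are ignored.
    """
    valid = (depth_frame > 0) & (background > 0)
    return valid & (depth_frame < background - threshold)

# Synthetic example: flat background at 2000 mm, a "mallet head"
# 500 mm in front of it covering a 2x2 block of pixels.
background = np.full((4, 6), 2000, dtype=np.uint16)
frame = background.copy()
frame[1:3, 2:4] = 1500

mask = find_foreground(frame, background)
print(int(mask.sum()))  # 4 foreground pixels
```

In a real system the mask would then be intersected with the user-selected track space and the tone-bar color check mentioned in the abstract before a note is triggered.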
Year | DOI | Venue |
---|---|---|
2016 | 10.1109/ISM.2016.0094 | 2016 IEEE International Symposium on Multimedia (ISM) |
Keywords | Field | DocType |
---|---|---|
Kinect, 3D tracking, virtual musical instruments, music education | Computer vision, Computer graphics (images), Computer science, Pixel, Artificial intelligence, 3D tracking, Mallet, Music education | Conference |
ISBN | Citations | PageRank |
---|---|---|
978-1-5090-4572-3 | 0 | 0.34 |
References | Authors |
---|---|
0 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Nikolas Burks | 1 | 0 | 0.34 |
Lloyd Smith | 2 | 0 | 0.34 |
Jamil Saquer | 3 | 0 | 0.34 |