Title
Databionic Visualization of Music Collections According to Perceptual Distance
Abstract
We describe the MusicMiner system for organizing large collections of music with databionic mining techniques. Low-level audio features are extracted from the raw audio data on short time windows during which the sound is assumed to be stationary. Static and temporal statistics were consistently and systematically used to aggregate the low-level features into high-level features. A supervised feature selection, targeted to model the perceptual distance between different-sounding music, led to a small set of non-redundant sound features. Clustering and visualization based on these feature vectors can discover emergent structures in collections of music. Visualization based on Emergent Self-Organizing Maps in particular enables the unsupervised discovery of timbrally consistent clusters that may or may not correspond to musical genres and artists. We demonstrate the visualization capabilities of the U-Map, displaying local sound differences based on the new audio features. Intuitive browsing of large music collections is offered based on the paradigm of topographic maps. The user can navigate the sound space and interact with the maps to play music or show the context of a song.
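The pipeline in the abstract — low-level features on short windows, then static and temporal statistics aggregated into one high-level vector per song — can be sketched as follows. This is an illustrative reconstruction, not the paper's actual feature set: the window sizes, the two toy descriptors (RMS energy, zero-crossing rate), and the choice of statistics are all assumptions for the example.

```python
import numpy as np

def frame_signal(x, win=512, hop=256):
    """Split a signal into short, overlapping windows during
    which the sound is assumed to be stationary."""
    n = 1 + (len(x) - win) // hop
    return np.stack([x[i * hop : i * hop + win] for i in range(n)])

def low_level_features(frames):
    """Toy per-window descriptors (hypothetical stand-ins for the
    paper's low-level audio features): RMS energy and zero-crossing rate."""
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    return np.stack([rms, zcr], axis=1)  # shape: (n_windows, 2)

def aggregate(feats):
    """Static statistics (mean, std over windows) plus a temporal
    statistic (std of window-to-window differences) form one
    fixed-length high-level feature vector per song."""
    static = np.concatenate([feats.mean(axis=0), feats.std(axis=0)])
    temporal = np.diff(feats, axis=0).std(axis=0)
    return np.concatenate([static, temporal])

# Usage: a 1-second toy "song" of noise at 8 kHz
rng = np.random.default_rng(0)
song = rng.standard_normal(8000)
vec = aggregate(low_level_features(frame_signal(song)))
print(vec.shape)  # one fixed-length feature vector, regardless of song length
```

Vectors of this kind are what feature selection then prunes and what the Emergent Self-Organizing Map clusters and displays.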
Year
2005
Venue
ISMIR 2005
Keywords
visualization, music similarity, perception, audio features, clustering, feature selection, feature vector
Field
Feature vector, Feature selection, Topographic map, Visualization, Musical, Computer science, Speech recognition, Raw audio format, Cluster analysis, Perception
DocType
Conference
Citations
35
PageRank
1.54
References
15
Authors
4
Name             Order  Citations  PageRank
Fabian Mörchen   1      372        17.94
Alfred Ultsch    2      403        51.77
Mario Nöcker     3      42         2.53
Christian Stamm  4      38         2.12