Title
AudioMetro: directing search for sound designers through content-based cues
Abstract
Sound designers source sounds from massive collections that they and sound librarians have heavily tagged. For each query, once successive keywords reach the limit of their ability to filter down the results, hundreds of sounds are left to be reviewed. AudioMetro combines a new content-based information visualization technique with instant audio feedback to facilitate this part of their workflow. We show through user evaluations based on known-item search in collections of textural sounds that a default grid layout ordered by filename unexpectedly outperforms content-based similarity layouts produced by a recent dimensionality reduction technique (Student-t Stochastic Neighbor Embedding), even when complemented with content-based glyphs that emphasize local neighborhoods and cue perceptual features. We propose a solution borrowed from image browsing: a proximity grid, whose density we optimize for nearest-neighbor preservation among the closest cells. Not only does it remove overlap, but a subsequent user evaluation shows that it also helps to direct the search. We based our experiments on an open dataset (the OLPC sound library) for replicability.
Year
2014
DOI
10.1145/2636879.2636880
Venue
Audio Mostly Conference
Keywords
sound effects, design, content-based similarity, experimentation, systems, evaluation/methodology, music information retrieval, graphical user interfaces, media browsers, visual variables, known-item search
Field
Glyph, Music information retrieval, Dimensionality reduction, Embedding, Information visualization, Computer science, Audio feedback, Multimedia, Workflow, Grid
DocType
Conference
Citations
2
PageRank
0.39
References
14
Authors
6
1. Christian Frisson
2. Stéphane Dupont
3. Willy Yvart
4. Nicolas Riche
5. Xavier Siebert
6. Thierry Dutoit