Title
A saliency-based approach to audio event detection and summarization
Abstract
In this paper, we approach the problem of audio summarization through saliency computation over audio streams, exploring the potential of a modulation model for detecting perceptually important audio events, along with various fusion schemes (linear, adaptive, and nonlinear) for combining the resulting saliency cues. A machine learning approach, in which the features are trained, is also applied for comparison with the proposed technique. The algorithm is evaluated on audio data taken from movies, and we show that the nonlinear fusion schemes perform best. Results are reported on the MovSum database using objective evaluations against ground truth marking the perceptually important audio events. The selected audio segments are further analyzed against a database labeled with audio categories, and a method for fine-tuning the selected audio events is proposed.
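The abstract distinguishes linear, adaptive, and nonlinear schemes for fusing saliency streams into a single curve from which summary segments are picked. The sketch below is an illustrative minimal example of these three fusion styles, not the paper's actual implementation; the stream names, weights, and the max-based nonlinear rule are assumptions for demonstration.

```python
import numpy as np

# Hypothetical per-frame saliency streams from two audio cues
# (e.g., modulation energy and loudness); values are synthetic.
rng = np.random.default_rng(0)
s1 = rng.random(10)  # saliency stream 1, values in [0, 1)
s2 = rng.random(10)  # saliency stream 2, values in [0, 1)

def linear_fusion(a, b, w=0.5):
    """Fixed-weight linear combination of two saliency streams."""
    return w * a + (1 - w) * b

def adaptive_fusion(a, b, eps=1e-8):
    """Data-driven weighting: the stream with higher instantaneous
    saliency dominates the fused score at that frame."""
    w = a / (a + b + eps)
    return w * a + (1 - w) * b

def nonlinear_fusion(a, b):
    """Max-based nonlinear fusion: a frame is salient if any cue fires."""
    return np.maximum(a, b)

fused = nonlinear_fusion(s1, s2)
# Rank frames by fused saliency; the top ones are summary candidates.
top_frames = np.argsort(fused)[::-1][:3]
```

A fixed linear weight treats all frames alike, while the adaptive and max-based rules let locally strong cues dominate, which is one intuition for why nonlinear fusion can surface short, perceptually salient events.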
Year: 2012
Venue: Signal Processing Conference
DocType: Conference
ISSN: 2219-5491
ISBN: 978-1-4673-1068-0
Citations: 5
PageRank: 0.47
References: 3
Authors: 4
Keywords: audio databases, audio streaming, classification, information retrieval, learning (artificial intelligence), modulation, sensor fusion, MovSum database, adaptive method, audio categories, audio data, audio event detection, audio segment selection, audio streams, audio summarization problem, labeled database, linear method, machine learning, modulation model, nonlinear fusion schemes, nonlinear methods, saliency-based approach, audio summarization, monomodal audio saliency
Field: Automatic summarization, Nonlinear system, Speech coding, Pattern recognition, Salience (neuroscience), Audio mining, Computer science, Nonlinear methods, Speech recognition, Sensor fusion, Artificial intelligence, Computation
Name                   Order  Citations  PageRank
Athanasia Zlatintsi    1      8          2.57
Petros Maragos         2      3733       591.97
Alexandros Potamianos  3      1443       149.05
G. Evangelopoulos      4      294        17.67