Title |
---|
Supervised nonnegative matrix factorization with Dual-Itakura-Saito and Kullback-Leibler divergences for music transcription |
Abstract |
---|
In this paper, we present a convex-analytic approach to supervised nonnegative matrix factorization (SNMF) based on the Dual-Itakura-Saito (Dual-IS) and Kullback-Leibler (KL) divergences for music transcription. The Dual-IS and KL divergences define convex fidelity functions, whereas the IS divergence defines a nonconvex one. The SNMF problem is formulated as minimizing the divergence-based fidelity function penalized by the ℓ1 and row-block ℓ1 norms subject to the nonnegativity constraint. Simulation results show that (i) the use of the Dual-IS and KL divergences yields better performance than the squared Euclidean distance and that (ii) the use of the Dual-IS divergence effectively prevents false alarms. |
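For context on the convexity claim: with data entry $v$ and model entry $z$, the generalized KL divergence $D_{\mathrm{KL}}(v\,\|\,z) = v\log(v/z) - v + z$ and the Dual-IS divergence $z/v - \log(z/v) - 1$ are both convex in $z$, whereas the IS divergence $v/z - \log(v/z) - 1$ is not. The sketch below is illustrative only, not the authors' convex-analytic algorithm: assuming a pre-trained, fixed dictionary `W` of note templates, it estimates activations by minimizing the KL fidelity with a plain ℓ1 penalty via standard multiplicative updates, omitting the row-block ℓ1 term; the names `sparse_kl_activations`, `lam`, and the toy data are hypothetical.

```python
import numpy as np

def sparse_kl_activations(V, W, lam=0.1, n_iter=200, eps=1e-12):
    """Hypothetical sketch: estimate nonnegative activations H for a
    fixed, pre-trained dictionary W by minimizing the generalized KL
    divergence D_KL(V || WH) plus lam * ||H||_1, subject to H >= 0,
    using standard multiplicative updates (not the paper's algorithm).
    """
    K, T = W.shape[1], V.shape[1]   # atoms x time frames
    rng = np.random.default_rng(0)
    H = rng.random((K, T)) + eps    # strictly positive initialization
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps
        # Numerator/denominator come from splitting the KL gradient into
        # its negative and positive parts; the l1 weight lam enters the
        # denominator, shrinking activations toward zero.
        H *= (W.T @ (V / WH)) / (W.T @ ones + lam + eps)
    return H

# Toy usage with synthetic spectrogram-like data.
rng = np.random.default_rng(1)
W = rng.random((64, 8))             # pre-trained note templates (fixed)
V = W @ rng.random((8, 100))        # 64 frequency bins x 100 frames
H = sparse_kl_activations(V, W, lam=0.05)
print(H.shape)                      # -> (8, 100)
```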
Year | Venue | Field |
---|---|---|
2016 | European Signal Processing Conference | Signal processing, Fidelity, Combinatorics, Divergence, Euclidean distance, Regular polygon, Squared Euclidean distance, Non-negative matrix factorization, Kullback–Leibler divergence, Mathematics
DocType | ISSN | Citations
---|---|---|
Conference | 2076-1465 | 0
PageRank | References | Authors
---|---|---|
0.34 | 0 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Hideaki Kagami | 1 | 0 | 0.68 |
Masahiro Yukawa | 2 | 272 | 30.44 |