Title
Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization.
Abstract
We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences, referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including the Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF algorithms. Owing to the additional degrees of freedom provided by the tuning parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.
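For context, a minimal NumPy sketch of the quantities the abstract refers to: the generic form of the AB-divergence (for alpha, beta, and alpha+beta nonzero) and a standard Beta-divergence multiplicative NMF update, of which the Lee-Seung, EMML, and Itakura-Saito rules mentioned above are special cases. The function names and the update rule shown here are standard textbook forms used as illustrative assumptions, not the exact AB-multiplicative algorithm derived in the paper.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta, eps=1e-12):
    # Generic AB-divergence (assumes alpha, beta and alpha+beta are nonzero;
    # the remaining cases are defined by continuity and omitted in this sketch).
    P = np.maximum(P, eps)
    Q = np.maximum(Q, eps)
    s = alpha + beta
    term = P**alpha * Q**beta - (alpha / s) * P**s - (beta / s) * Q**s
    return -term.sum() / (alpha * beta)

def nmf_beta_multiplicative(V, rank, beta=1.0, n_iter=200, eps=1e-12, seed=0):
    # Standard multiplicative NMF updates for the Beta-divergence
    # (beta = 2: Euclidean / Lee-Seung / ISRA, beta = 1: generalized KL / EMML,
    # beta = 0: Itakura-Saito); an illustrative baseline, not the paper's
    # AB-multiplicative algorithm.
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V * WH**(beta - 2))) / (W.T @ WH**(beta - 1) + eps)
        WH = W @ H + eps
        W *= ((V * WH**(beta - 2)) @ H.T) / (WH**(beta - 1) @ H.T + eps)
    return W, H

# Usage: factorize a random nonnegative matrix and measure the fit with an AB-divergence.
V = np.random.default_rng(1).random((30, 20))
W, H = nmf_beta_multiplicative(V, rank=5, beta=1.0)
print(ab_divergence(V, W @ H, alpha=1.0, beta=0.5))
```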
Year
2011
DOI
10.3390/e13010134
Venue
ENTROPY
Keywords
nonnegative matrix factorization (NMF), robust multiplicative NMF algorithms, similarity measures, generalized divergences, Alpha-, Beta-, Gamma-divergences, extended Itakura-Saito like divergences, generalized Kullback-Leibler divergence
Field
Parameterized complexity, Multiplicative function, Expectation–maximization algorithm, Outlier, Robustness (computer science), Reconstruction algorithm, Non-negative matrix factorization, Beta (finance), Statistics, Mathematics
DocType
Journal
Volume
13
Issue
1
Citations
74
PageRank
3.87
References
26
Authors
3
Name              Order  Citations  PageRank
Andrzej Cichocki  1      5228       508.42
Sergio Cruces     2      206        19.05
Shunichi Amari    3      5992       1269.68