Title
Kullback-Leibler divergence for nonnegative matrix factorization
Abstract
The I-divergence, the unnormalized generalization of the Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of those matrices, so learning the scales in gradient-descent optimization may require many iterations. This is often handled by explicit normalization of one of the matrices, but that step may actually increase the I-divergence and is not covered by the NMF monotonicity proof. A simple remedy that we study here is to normalize the input data. Such normalization allows the I-divergence to be replaced with the original KL-divergence for NMF and its variants. We show that using the KL-divergence takes the normalization structure into account in a natural way and brings improvements to nonnegative matrix factorization: the gradients of the normalized KL-divergence are well scaled and thus lead to a new projected gradient method for NMF that runs faster or yields better approximations than three other widely used NMF algorithms.
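For reference, the two objectives the abstract contrasts can be written out with their standard definitions. This is a sketch using the usual NMF notation X ≈ WH for the nonnegative data matrix and its factorization; the notation is an assumption on our part, since the record itself gives no formulas.

D_I(X \,\|\, WH) = \sum_{ij} \Bigl( X_{ij} \log \frac{X_{ij}}{(WH)_{ij}} - X_{ij} + (WH)_{ij} \Bigr)

D_{\mathrm{KL}}(X \,\|\, WH) = \sum_{ij} X_{ij} \log \frac{X_{ij}}{(WH)_{ij}}, \qquad \text{with } \sum_{ij} X_{ij} = \sum_{ij} (WH)_{ij} = 1

When the data and the approximation are each normalized to sum to one, the extra linear terms of the I-divergence cancel and the plain KL-divergence remains, which is the setting the abstract argues yields well-scaled gradients.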
Year
2011
DOI
10.1007/978-3-642-21735-7_31
Venue
ICANN (1)
Keywords
nonnegative matrix factorization, input data, gradient-descent optimization, kullback-leibler divergence, original kl-divergence, nmf monotonicity proof, nmf algorithm, normalized kl-divergence, explicit normalization, factorizing matrix, normalization structure
Field
Applied mathematics, Normalization (statistics), Divergence, Nonnegative matrix, Matrix (mathematics), Artificial intelligence, Gradient method, Discrete mathematics, Monotonic function, Pattern recognition, Non-negative matrix factorization, Mathematics, Kullback–Leibler divergence
DocType
Conference
Volume
6791
ISSN
0302-9743
Citations
7
PageRank
0.50
References
6
Authors
4
Name            Order   Citations   PageRank
Zhirong Yang    1       289         17.27
He Zhang        2       67          6.58
Zhijian Yuan    3       220         12.17
Erkki Oja       4       6701        797.08