Title
Learning Fast Dictionaries for Sparse Representations Using Low-Rank Tensor Decompositions.
Abstract
A new dictionary learning model is introduced where the dictionary matrix is constrained to be a sum of R Kronecker products of K terms. It offers a more compact representation and requires less training data than the general dictionary learning model, while generalizing Tucker dictionary learning. The proposed Higher Order Sum of Kroneckers model can be computed by merging dictionary learning approaches with the tensor Canonical Polyadic Decomposition. Experiments on image denoising illustrate the advantages of the proposed approach.
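The abstract's structural constraint can be made concrete with a minimal NumPy sketch (an assumed illustration, not the authors' code): a dictionary D built as a sum of R Kronecker products with K = 2 factors per term and hypothetical dimensions, showing the parameter savings over a dense dictionary of the same size.

```python
import numpy as np

# Illustrative sketch of a sum-of-Kroneckers dictionary:
#   D = sum_{r=1..R} A_r  (Kronecker)  B_r
# with K = 2 factors per term; R and the dimensions below are
# hypothetical choices, not values from the paper.
rng = np.random.default_rng(0)
R = 3                          # number of Kronecker terms in the sum
m1, n1 = 4, 5                  # shape of each left factor A_r
m2, n2 = 4, 5                  # shape of each right factor B_r

A = [rng.standard_normal((m1, n1)) for _ in range(R)]
B = [rng.standard_normal((m2, n2)) for _ in range(R)]

# Assemble the (m1*m2) x (n1*n2) dictionary from the factors.
D = sum(np.kron(A[r], B[r]) for r in range(R))

# Storage comparison: the structured form keeps only the factors,
# R*(m1*n1 + m2*n2) numbers, versus (m1*m2)*(n1*n2) for a dense D.
structured_params = sum(a.size + b.size for a, b in zip(A, B))
dense_params = D.size
print(D.shape, structured_params, dense_params)  # (16, 25) 120 400
```

The compactness claim in the abstract corresponds to the parameter counts above: multiplying by a Kronecker-structured D can also be done factor-by-factor, which is what makes such dictionaries "fast" to apply.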
Year
2018
DOI
10.1007/978-3-319-93764-9_42
Venue
Lecture Notes in Computer Science
Keywords
Kronecker product, Tensor data, Dictionary learning
Field
Training set, Kronecker delta, Kronecker product, Dictionary learning, Tensor, Algebra, Matrix (mathematics), Generalization, Computer science, Merge (version control)
DocType
Conference
Volume
10891
ISSN
0302-9743
Citations
0
PageRank
0.34
References
9
Authors
3
Name                  Order  Citations  PageRank
Cassio Fraga Dantas   1      3          2.78
Jeremy E. Cohen       2      46         8.34
Rémi Gribonval        3      1207       83.59