Abstract |
---|
This work presents a new algorithm for dictionary learning. Existing algorithms such as MOD and K-SVD often fail to find the best dictionary because they become trapped in local minima. Olshausen and Field's Sparsenet algorithm relies on projected gradient descent with a fixed step; with the right step size it can escape local minima and converge towards the global minimum, so the problem becomes choosing that step size. In this work we derive the expression of the optimal descent step, and then use a step twice as large. This larger step allows the descent to bypass local minima and yields significantly better results than existing algorithms. The algorithms are compared on synthetic data: our method outperforms existing ones both in approximation quality and in perfect recovery rate when an oracle support for the sparse representation is provided. |
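The large-step update described in the abstract can be sketched as a single projected gradient step on the dictionary. This is a minimal illustration, not the paper's implementation: the function name, the matrix shapes, and the choice of 1/L (with L the largest eigenvalue of XXᵀ, the Lipschitz constant of the gradient) as the "optimal" step are assumptions made here for concreteness; the abstract's key idea, doubling the optimal step, is the `step_scale=2.0` factor.

```python
import numpy as np

def large_step_dictionary_update(D, X, S, step_scale=2.0):
    """One projected gradient step on the dictionary (illustrative sketch).

    D: dictionary (d x K), columns are unit-norm atoms
    X: sparse coefficients (K x N)
    S: training signals (d x N)
    NOTE: using 1/L as the optimal step is an assumption for this sketch;
    the paper derives its own expression. step_scale=2.0 doubles it.
    """
    residual = S - D @ X                     # approximation error
    grad = -residual @ X.T                   # gradient of 0.5*||S - D X||_F^2 w.r.t. D
    L = np.linalg.eigvalsh(X @ X.T).max()    # Lipschitz constant of the gradient
    mu = step_scale / L                      # twice the 1/L "optimal" step
    D = D - mu * grad                        # gradient step
    D = D / np.linalg.norm(D, axis=0, keepdims=True)  # project atoms to unit norm
    return D

# Tiny usage example with random data.
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 12))
D /= np.linalg.norm(D, axis=0, keepdims=True)
X = rng.standard_normal((12, 20))
S = rng.standard_normal((8, 20))
D_new = large_step_dictionary_update(D, X, S)
```

The projection back to unit-norm atoms is what makes this a *projected* gradient descent, matching the Sparsenet-style scheme the abstract refers to.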
Year | DOI | Venue |
---|---|---
2012 | 10.1007/978-3-642-28551-6_29 | LVA/ICA |
Keywords | Field | DocType
---|---|---
right step,large step,optimal step,best dictionary,fixed step,dictionary learning,right step size,sparsenet algorithm,gradient descent,large step gradient descent,sparse representation,local minimum | Stochastic gradient descent,Gradient descent,Dictionary learning,K-SVD,Computer science,Sparse approximation,Oracle,Maxima and minima,Theoretical computer science,Synthetic data | Conference
Citations | PageRank | References
---|---|---
11 | 0.76 | 4
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---
Boris Mailhé | 1 | 103 | 7.22 |
M. D. Plumbley | 2 | 1915 | 202.38 |