Abstract |
---|
The label consistent K-SVD (LC-KSVD) algorithm has recently been applied successfully to image classification. The objective function of LC-KSVD consists of a reconstruction error, a classification error, and a discriminative sparse-code error, with an ℓ0-norm sparsity regularization term. The ℓ0-norm, however, makes the problem NP-hard. Although greedy methods such as orthogonal matching pursuit can approximate a solution, it remains difficult to find the optimal sparse solution. To overcome this limitation, we propose label embedded dictionary learning (LEDL), which embeds label information into an ℓ1-regularized dictionary learning algorithm to improve performance on image classification tasks. Specifically, (i) compared to LC-KSVD, we use the ℓ1-norm to convert the sparsity-constrained problem into a convex optimization problem; (ii) we adopt the alternating direction method of multipliers (ADMM) to solve the sparsity-constrained problem and speed up optimization; (iii) extensive experiments on six benchmark datasets show that the classification accuracy of the proposed algorithm exceeds that of LC-KSVD and achieves state-of-the-art performance. |
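The ℓ1-regularized sparse-coding subproblem that ADMM solves here can be sketched as a standard lasso-style splitting; this is a minimal generic sketch, not the paper's exact formulation, and the names `D`, `y`, `lam`, `rho` are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of the l1-norm (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(D, y, lam=0.1, rho=1.0, n_iter=200):
    """Solve min_x 0.5*||D x - y||^2 + lam*||x||_1 via ADMM
    with the splitting x = z. Generic sketch, not the paper's code."""
    n = D.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # The x-update solves a ridge-like linear system; cache its matrix.
    A = D.T @ D + rho * np.eye(n)
    Dty = D.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, Dty + rho * (z - u))   # quadratic step
        z = soft_threshold(x + u, lam / rho)          # l1 proximal step
        u = u + x - z                                 # dual ascent
    return z

# Tiny demo: recover a 2-sparse code under a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[[3, 17]] = [1.5, -2.0]
y = D @ x_true
x_hat = admm_lasso(D, y, lam=0.05)
```

In the full dictionary-learning loop, this sparse-coding step alternates with a dictionary update; ADMM's closed-form subproblems are what give the optimization-speed advantage the abstract claims over ℓ0-based pursuit.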
Year | DOI | Venue |
---|---|---|
2020 | 10.1016/j.neucom.2019.12.071 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
Dictionary learning, Sparse representation, Label embedded dictionary learning, Image classification | Matching pursuit, Dictionary learning, Pattern recognition, Reconstruction error, Regularization (mathematics), Artificial intelligence, Contextual image classification, Discriminative model, Classification rate, Convex optimization, Mathematics | Journal

Volume | ISSN | Citations
---|---|---|
385 | 0925-2312 | 1

PageRank | References | Authors
---|---|---|
0.36 | 0 | 5

Name | Order | Citations | PageRank |
---|---|---|---|
Shuai Shao | 1 | 3 | 2.41 |
Rui Xu | 2 | 3 | 2.75 |
Weifeng Liu | 3 | 87 | 13.82 |
Bao-Di Liu | 4 | 166 | 27.34 |
Yanjiang Wang | 5 | 15 | 8.65 |