Title |
---|
KKT condition-based smoothing recurrent neural network for nonsmooth nonconvex optimization in compressed sensing |
Abstract |
---|
This work investigates a smoothing recurrent neural network (SRNN) built on a smoothing approximation technique and an equivalent form of the Karush–Kuhn–Tucker (KKT) condition. The network is designed to handle the \(L_0\)-norm minimization model arising in compressed sensing, after replacing that model with a nonsmooth nonconvex approximation. The existence, uniqueness, and limiting behavior of the network's solutions are established with standard mathematical tools. Several nonconvex approximation functions are examined to determine which is most suitable for SRNN in recovering sparse signals under different kinds of sensing matrices. Comparative experiments validate that, among the chosen approximation functions, the transformed L1 function (TL1), the logarithm function (Log), and the arctangent penalty function are effective for sparse recovery; SRNN-TL1 is robust and insensitive to the coherence of the sensing matrix, and it is competitive with several existing discrete numerical algorithms and neural network methods for compressed sensing problems. |
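The abstract names three nonconvex surrogates for the \(L_0\) penalty: TL1, Log, and arctangent. As a rough illustration only, the sketch below implements commonly used forms of these penalties from the sparse-recovery literature; the parameter names `a` and `eps` and their default values are assumptions for illustration, not the paper's exact formulations.

```python
import numpy as np

def tl1(x, a=1.0):
    # Transformed L1 (TL1) penalty: (a+1)|x| / (a+|x|).
    # Interpolates between L0 (a -> 0) and L1 (a -> inf) behavior.
    ax = np.abs(x)
    return (a + 1.0) * ax / (a + ax)

def log_penalty(x, eps=0.1):
    # Logarithmic penalty: log(1 + |x|/eps), a standard nonconvex
    # surrogate that penalizes small entries more sharply than L1.
    return np.log1p(np.abs(x) / eps)

def atan_penalty(x, a=1.0):
    # Arctangent penalty: arctan(a|x|), smooth and bounded, so large
    # entries are penalized almost equally (L0-like saturation).
    return np.arctan(a * np.abs(x))

# All three vanish at 0 and flatten out for large |x|, mimicking
# the 0/1 profile of the L0 "norm" far better than |x| does.
x = np.array([0.0, 0.01, 0.1, 1.0, 10.0])
for name, pen in [("TL1", tl1), ("Log", log_penalty), ("atan", atan_penalty)]:
    print(name, np.round(pen(x), 3))
```

All three surrogates are separable, so they can serve as the objective that a smoothing network like SRNN approximates with a differentiable function before integrating the resulting dynamics.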
Year | DOI | Venue |
---|---|---|
2019 | 10.1007/s00521-017-3239-6 | Neural Computing and Applications |
Keywords | Field | DocType
---|---|---|
Compressed sensing, Nonsmooth and nonconvex approximation, Smoothing approximation, Neural networks, KKT condition | Mathematical optimization, Matrix (mathematics), Recurrent neural network, Smoothing, Artificial intelligence, Logarithm, Karush–Kuhn–Tucker conditions, Artificial neural network, Mathematics, Compressed sensing, Machine learning, Penalty method | Journal
Volume | Issue | ISSN
---|---|---|
31 | 7 | 1433-3058
Citations | PageRank | References
---|---|---|
1 | 0.35 | 25
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Dan Wang | 1 | 101 | 40.29 |
Zhuhong Zhang | 2 | 186 | 16.41 |