Abstract
---
Sparse modeling has demonstrated superior performance in many applications. Compared to optimization-based approaches, Bayesian sparse modeling generally yields sparser results along with confidence estimates. Using spike-and-slab priors, we propose hierarchical sparse models for the single-task and multitask scenarios: Hi-BCS and CHi-BCS. We draw connections between these two methods and their optimization-based counterparts and use expectation propagation for inference. Experimental results on synthetic and real data demonstrate that Hi-BCS and CHi-BCS perform comparably to or better than their optimization-based counterparts.
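The spike-and-slab prior mentioned in the abstract models each coefficient as the product of a Bernoulli support indicator (the "spike" at zero) and a Gaussian value (the "slab"). The sketch below illustrates that generative model for a compressively sensed sparse signal; all dimensions and parameter values (`pi`, `sigma_slab`, `sigma_noise`) are illustrative assumptions, not settings from the paper, and no inference (e.g., expectation propagation) is performed.

```python
import numpy as np

# Minimal sketch of a spike-and-slab generative model for sparse signals,
# in the spirit of Bayesian compressive sensing. Parameter values are
# illustrative assumptions only.
rng = np.random.default_rng(0)

n = 100            # signal dimension
m = 40             # number of compressive measurements
pi = 0.1           # prior probability that a coefficient is active
sigma_slab = 1.0   # std of the Gaussian "slab" on active coefficients
sigma_noise = 0.01 # measurement noise std

# Spike-and-slab draw: z selects the active support, theta gives slab values.
z = rng.random(n) < pi                       # Bernoulli(pi) indicators
theta = rng.normal(0.0, sigma_slab, size=n)  # slab values
w = z * theta                                # sparse signal: zero off-support

# Compressive measurements y = Phi w + noise with a Gaussian sensing matrix.
Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
y = Phi @ w + rng.normal(0.0, sigma_noise, size=m)

print("active coefficients:", int(z.sum()))
print("measurement vector shape:", y.shape)
```

A posterior over `w` given `y` under this prior is what Hi-BCS targets; the multitask variant (CHi-BCS) couples the support indicators across related tasks.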
Year | DOI | Venue
---|---|---
2013 | 10.1109/ICASSP.2013.6638229 | ICASSP

Keywords | Field | DocType
---|---|---
hierarchical, optimisation, hi-bcs, collaborative hierarchical bayesian compressive sensing, spike and slab, bayesian sparse modeling, expectation propagation, bayes methods, spike and slab priors, hierarchical bayesian compressive sensing, optimization based approach, compressed sensing, hierarchical sparse modeling, chi-bcs, sparse modeling, dictionaries, cost function, noise | Pattern recognition, Inference, Computer science, Sparse approximation, Slab, Artificial intelligence, Expectation propagation, Prior probability, Compressed sensing, Bayesian probability | Conference

ISSN | Citations | PageRank
---|---|---
1520-6149 | 9 | 0.49

References | Authors
---|---
9 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Yuanming Suo | 1 | 75 | 6.73 |
Minh Dao | 2 | 121 | 11.14 |
T. D. Tran | 3 | 179 | 20.04 |
Umamahesh Srinivas | 4 | 118 | 7.66 |
Vishal Monga | 5 | 679 | 57.73 |