Abstract |
---|
We apply both distance-based and kernel-based mutual dependence measures to independent component analysis (ICA), generalizing dCovICA to MDMICA, which minimizes empirical dependence measures as the objective function in both deflation and parallel manners. To solve this minimization problem, we introduce Latin hypercube sampling (LHS) and a global optimization method, Bayesian optimization (BO), to improve the initialization of the Newton-type local optimization method. The performance of MDMICA is evaluated in various simulation studies and an image data example. When the ICA model is correctly specified, MDMICA achieves results competitive with existing approaches. When the ICA model is misspecified, the independent components estimated by MDMICA are less mutually dependent than the observed components, whereas those estimated by other approaches can be even more mutually dependent than the observed components. |
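The core idea of the abstract can be illustrated concretely. The following is a minimal sketch, not the paper's actual algorithm: it uses the empirical (V-statistic) squared distance covariance as the dependence measure, parametrizes the 2-D unmixing matrix of whitened data by a single rotation angle, initializes with a one-dimensional Latin hypercube sample of angles, and substitutes a simple local grid refinement for the Newton-type and Bayesian optimization steps described in the paper. All variable names and the toy mixing matrix are hypothetical.

```python
import numpy as np

def dcov2(x, y):
    """Squared empirical (V-statistic) distance covariance of two 1-D samples."""
    def centered(v):
        d = np.abs(v[:, None] - v[None, :])          # pairwise distance matrix
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()
    return (centered(x) * centered(y)).mean()

rng = np.random.default_rng(0)
n = 500
# Hypothetical demo: two independent non-Gaussian sources, mixed linearly.
s = rng.uniform(-1, 1, (2, n)) ** 3
mix = np.array([[2.0, 1.0], [1.0, 1.5]])
x = mix @ s
# Whiten the observations so the unmixing matrix reduces to a rotation.
x = x - x.mean(axis=1, keepdims=True)
w, v = np.linalg.eigh(np.cov(x))
z = v @ np.diag(w ** -0.5) @ v.T @ x

def objective(theta):
    """Dependence between the two rotated components; ICA minimizes this."""
    c, s_ = np.cos(theta), np.sin(theta)
    y = np.array([[c, -s_], [s_, c]]) @ z
    return dcov2(y[0], y[1])

# 1-D Latin hypercube sample over [0, pi/2): one draw per stratum.
k = 16
thetas = (np.arange(k) + rng.uniform(0, 1, k)) / k * (np.pi / 2)
theta0 = thetas[np.argmin([objective(t) for t in thetas])]

# Cheap local refinement around the best LHS point (stand-in for the
# Newton-type local optimization used in the paper).
fine = theta0 + np.linspace(-np.pi / (2 * k), np.pi / (2 * k), 41)
theta_hat = fine[np.argmin([objective(t) for t in fine])]
```

The refined angle `theta_hat` yields rotated components whose empirical dependence is at most that of the best LHS candidate, illustrating the two-stage (global initialization, then local search) strategy.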
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/ICMLA.2019.00107 | ICMLA |
Field | DocType | Citations
---|---|---
Pattern recognition, Global optimization, Computer science, Bayesian optimization, Algorithm, Artificial intelligence, Independent component analysis, Sampling (statistics), Mutual dependence, Local search (optimization), Initialization, Latin hypercube sampling | Conference | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Ze Jin | 1 | 3 | 2.09 |
David S. Matteson | 2 | 13 | 5.08 |
Tianrong Zhang | 3 | 0 | 0.34 |