Abstract
---
Linear regression without correspondences concerns the recovery of a signal in the linear regression setting where the correspondences between the observations and the linear functionals are unknown. The associated maximum likelihood objective is NP-hard to compute when the signal dimension is larger than one. To optimize this objective, we reformulate it as a concave minimization problem, which we solve via branch-and-bound. This is supported by a computable search space to branch on, an effective lower-bounding scheme via convex envelope minimization, and a refined upper bound, all arising naturally from the concave minimization reformulation. The resulting algorithm outperforms state-of-the-art methods on fully shuffled data and remains tractable for signals of dimension up to 8, a regime untouched by prior work.
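The keywords list the linear assignment problem, which is how the shuffled least-squares objective can be evaluated at a fixed candidate signal: the inner minimization over permutations reduces to an assignment with costs `(y[i] - (A @ x)[j])**2`. A minimal sketch under that formulation (helper names and the alternating baseline are illustrative only, not the paper's branch-and-bound method):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def shuffled_ls_objective(A, y, x):
    """Least-squares residual of x under the best permutation of y.

    With correspondences unknown, evaluating the objective at a fixed x
    requires minimizing over permutations; the cost matrix
    C[i, j] = (y[i] - (A @ x)[j])**2 turns this into a linear assignment
    problem. (Hypothetical helper name, not from the paper.)
    """
    r = A @ x
    C = (y[:, None] - r[None, :]) ** 2
    rows, cols = linear_sum_assignment(C)   # cols[i]: row of A matched to y[i]
    return C[rows, cols].sum(), cols

def alternating_minimization(A, y, x0, iters=20):
    """Alternate between the optimal assignment and ordinary least squares.

    A simple local heuristic for the same objective -- NOT the paper's
    branch-and-bound algorithm. The objective is non-increasing per
    iteration, but the iterates can stall in local minima.
    """
    n = A.shape[0]
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        _, cols = shuffled_ls_objective(A, y, x)
        y_aligned = np.empty(n)
        y_aligned[cols] = y                  # align observations with rows of A
        x, *_ = np.linalg.lstsq(A, y_aligned, rcond=None)
    return x
```

On noiseless data the objective evaluated at the true signal is exactly zero, since the assignment recovers a perfect matching; the alternating baseline only guarantees monotone descent from its starting point.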
Year | DOI | Venue
---|---|---|
2020 | 10.1109/LSP.2020.3019693 | IEEE SIGNAL PROCESSING LETTERS

Keywords | DocType | Volume
---|---|---|
Minimization, Upper bound, Signal processing algorithms, Linear regression, Linear programming, Sensors, Estimation, Linear regression without correspondences, unlabeled sensing, homomorphic sensing, concave minimization, branch-and-bound, linear assignment problem | Journal | 27

ISSN | Citations | PageRank
---|---|---|
1070-9908 | 0 | 0.34

References | Authors
---|---|
0 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Liangzu Peng | 1 | 1 | 2.04 |
Manolis C. Tsakiris | 2 | 50 | 9.79 |