Abstract |
---|
Information geometry is used to elucidate convex optimization problems under an L1 constraint. A convex function induces a Riemannian metric and two dually coupled affine connections on the manifold of the parameters of interest. A generalized Pythagorean theorem and a projection theorem hold in such a manifold. An extended LARS algorithm, applicable to both under-determined and over-determined cases, is studied, and properties of its solution path are given. The algorithm is shown to be a Minkowskian gradient-descent method, which moves in the steepest direction of a target function under the Minkowskian L1 norm. Two dually coupled affine coordinate systems are useful for analyzing the solution path. |
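The abstract's notion of steepest descent under the L1 (Minkowskian) norm can be illustrated with a small sketch: maximizing the decrease of a convex target per unit of L1 step length selects the single coordinate with the largest gradient magnitude, so each step moves along one axis, the piecewise-axis behavior that underlies LARS-type solution paths. This is a hypothetical illustration, not the authors' algorithm; the function names and the toy quadratic target are assumptions made here for demonstration.

```python
import numpy as np

def l1_steepest_descent(grad_f, x0, step=0.05, iters=200):
    """Fixed-step steepest descent with respect to the L1 norm.

    Each iteration moves only along the coordinate whose gradient
    component has the largest magnitude (the steepest direction per
    unit L1 length). Illustrative sketch only.
    """
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = grad_f(x)
        i = int(np.argmax(np.abs(g)))   # coordinate of steepest L1 descent
        x[i] -= step * np.sign(g[i])    # axis-aligned move of L1 length `step`
    return x

# Toy convex target: f(x) = 0.5 * ||A x - b||^2 with minimizer x = (1, 1).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)

x_star = l1_steepest_descent(grad, np.zeros(2))
# x_star lands within one step length of the minimizer (1, 1)
```

Note the contrast with Euclidean (L2) steepest descent, which would move along the full gradient vector; the L1 geometry is what produces the one-coordinate-at-a-time, piecewise-linear paths the abstract associates with the extended LARS algorithm.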
Year | DOI | Venue |
---|---|---|
2013 | 10.1109/JSTSP.2013.2241014 | Selected Topics in Signal Processing, IEEE Journal of |
Keywords | Field | DocType
---|---|---|
geometry,gradient methods,optimisation,signal processing,Information geometry,Minkowskian gradient-descent method,Riemannian metric,convex optimization problems,extended LARS algorithm,generalized Pythagorean theorem,projection theorem,sparse optimization,steepest direction,Extended LARS,L1-constraint,information geometry,sparse convex optimization | Information geometry,Mathematical optimization,Proximal Gradient Methods,Convex function,Proper convex function,Conic optimization,Convex optimization,Danskin's theorem,Convex analysis,Mathematics | Journal
Volume | Issue | ISSN
---|---|---|
7 | 4 | 1932-4553
Citations | PageRank | References
---|---|---|
2 | 0.44 | 3
Authors |
---|
2 |
Name | Order | Citations | PageRank
---|---|---|---|
Shun-ichi Amari | 1 | 5992 | 1269.68
Masahiro Yukawa | 2 | 272 | 30.44