Abstract
---
We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in the special case of soft-margin support vector machines. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
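The hardness result concerns the trajectory of weight vectors produced by online (sub)gradient descent on the soft-margin SVM objective. As context, a minimal sketch of that update rule (a generic textbook version, not the paper's encoding construction; the step-size schedule `1/(lam*t)` and hypothetical function name are assumptions):

```python
import numpy as np

def ogd_soft_margin_svm(stream, lam=0.1, dim=2):
    """Online subgradient descent on the per-round soft-margin SVM loss
    f_t(w) = max(0, 1 - y_t * <w, x_t>) + (lam / 2) * ||w||^2.

    `stream` is an iterable of (x, y) pairs with y in {-1, +1}.
    Returns the full trajectory of weight vectors, since the paper's
    result is about the fine-grained behavior of this trajectory.
    """
    w = np.zeros(dim)
    trajectory = [w.copy()]
    for t, (x, y) in enumerate(stream, start=1):
        # Subgradient of the hinge term: -y*x if the margin is violated, else 0.
        g = -y * x if y * np.dot(w, x) < 1 else np.zeros_like(x)
        g = g + lam * w                  # gradient of the L2 regularizer
        w = w - (1.0 / (lam * t)) * g    # standard 1/(lam*t) step size (assumed)
        trajectory.append(w.copy())
    return trajectory
```

The theorem says that deciding fine-grained properties of a trajectory like this one (e.g., whether some coordinate ever crosses a threshold) can be PSPACE-hard, so no efficient general-purpose analysis of such runs is possible under standard assumptions.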
Year | Venue | Field
---|---|---
2018 | COLT | ENCODE, Mathematical optimization, Gradient descent, Support vector machine, Algorithm, Mathematics, Special case, Computation

DocType | Volume | Citations
---|---|---
Journal | abs/1807.01280 | 0

PageRank | References | Authors
---|---|---
0.34 | 10 | 3
Name | Order | Citations | PageRank |
---|---|---|---
Vaggos Chatziafratis | 1 | 9 | 4.23 |
Tim Roughgarden | 2 | 4177 | 353.32 |
Joshua R. Wang | 3 | 69 | 5.83 |