Abstract
---
Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful GAN variants, require solving a min-max optimization problem to global optimality, yet in practice they are trained successfully with stochastic gradient descent-ascent. In this paper, we show that when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.
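The descent-ascent dynamics the abstract refers to can be sketched on a toy instance. This is a hedged illustration, not the paper's construction: it uses a one-layer generator `g(z) = z + b`, a linear critic `D(x) = w * x`, and an added L2 penalty on the critic (my assumption, to keep plain simultaneous updates stable on this bilinear toy objective).

```python
import numpy as np

# Toy sketch of stochastic gradient descent-ascent (not the paper's exact
# setup): fit a one-parameter "one-layer generator" g(z) = z + b to samples
# from N(mu, 1) against a linear critic D(x) = w * x.  The 0.5*w**2 critic
# penalty is an assumption added so that simultaneous updates converge.

rng = np.random.default_rng(0)
mu = 3.0          # true mean the generator should learn
b, w = 0.0, 0.0   # generator bias and critic weight
lr = 0.05
batch = 256

for _ in range(4000):
    real = rng.normal(mu, 1.0, batch)       # samples from the data distribution
    z = rng.normal(0.0, 1.0, batch)         # latent noise
    fake = z + b                            # one-layer generator output
    # Minmax objective: f(b, w) = E[w*real] - E[w*fake] - 0.5*w**2
    grad_w = real.mean() - fake.mean() - w  # critic gradient (ascent direction)
    grad_b = -w                             # generator gradient (descent direction)
    w += lr * grad_w                        # critic: gradient ascent step
    b -= lr * grad_b                        # generator: gradient descent step

print(round(b, 2))  # b should approach mu = 3.0
```

With simultaneous small-step updates, the pair `(b, w)` spirals into the saddle point `(mu, 0)`, so the generator bias recovers the true mean; without the critic penalty, plain descent-ascent on this bilinear objective would cycle instead of converging.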
Year | Venue | DocType
---|---|---
2020 | ICML | Conference

Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors (4)
Name | Order | Citations | PageRank
---|---|---|---
Qi Lei | 1 | 68 | 11.12
Jason D. Lee | 2 | 711 | 48.29
Alexandros G. Dimakis | 3 | 3575 | 206.71
Constantinos Daskalakis | 4 | 1179 | 85.25