Conference Paper: Generalization and equilibrium in generative adversarial nets (GANs)
Title | Generalization and equilibrium in generative adversarial nets (GANs) |
---|---|
Authors | Arora, Sanjeev; Ge, Rong; Liang, Yingyu; Ma, Tengyu; Zhang, Yi |
Issue Date | 2017 |
Citation | 34th International Conference on Machine Learning, ICML 2017, 2017, v. 1, p. 322-349 |
Abstract | Generalization is defined for the training of generative adversarial networks (GANs), and it is shown that generalization is not guaranteed for popular distances between distributions such as Jensen-Shannon or Wasserstein. In particular, training may appear successful while the trained distribution remains arbitrarily far from the target distribution in standard metrics. It is shown that generalization does occur for a much weaker metric, called the neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a natural training objective (Wasserstein) when generator capacity and training set sizes are moderate. Finally, these theoretical ideas suggest a new training protocol, mix+GAN, which can be combined with any existing method and is empirically found to improve some existing GAN protocols out of the box. A sketch of the neural net distance follows this table. |
Persistent Identifier | http://hdl.handle.net/10722/341227 |
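The central metric in the abstract is the neural net distance. As a hedged sketch of its definition (the symbols μ, ν, and 𝓕, standing for the target distribution, the generated distribution, and the class of discriminators the network can realize, are notational assumptions; they are not spelled out on this record page):

```latex
% Neural net distance between the target distribution \mu and the
% generated distribution \nu, over a class \mathcal{F} of discriminators
% realizable by the network architecture (notation assumed):
d_{\mathcal{F}}(\mu, \nu)
  = \sup_{D \in \mathcal{F}}
    \left| \mathbb{E}_{x \sim \mu}[D(x)] - \mathbb{E}_{x \sim \nu}[D(x)] \right|
```

Generalization in this sense means the empirical distance, computed from a moderate number of samples, stays close to the population distance, whereas the abstract notes no such guarantee holds for the Jensen-Shannon or Wasserstein distances.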
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Arora, Sanjeev | - |
dc.contributor.author | Ge, Rong | - |
dc.contributor.author | Liang, Yingyu | - |
dc.contributor.author | Ma, Tengyu | - |
dc.contributor.author | Zhang, Yi | - |
dc.date.accessioned | 2024-03-13T08:41:10Z | - |
dc.date.available | 2024-03-13T08:41:10Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | 34th International Conference on Machine Learning, ICML 2017, 2017, v. 1, p. 322-349 | - |
dc.identifier.uri | http://hdl.handle.net/10722/341227 | - |
dc.description.abstract | Generalization is defined for the training of generative adversarial networks (GANs), and it is shown that generalization is not guaranteed for popular distances between distributions such as Jensen-Shannon or Wasserstein. In particular, training may appear successful while the trained distribution remains arbitrarily far from the target distribution in standard metrics. It is shown that generalization does occur for a much weaker metric, called the neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a natural training objective (Wasserstein) when generator capacity and training set sizes are moderate. Finally, these theoretical ideas suggest a new training protocol, mix+GAN, which can be combined with any existing method and is empirically found to improve some existing GAN protocols out of the box. A code sketch of the mix+GAN mixture idea follows this table. | - |
dc.language | eng | - |
dc.relation.ispartof | 34th International Conference on Machine Learning, ICML 2017 | - |
dc.title | Generalization and equilibrium in generative adversarial nets (GANs) | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85048679392 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 322 | - |
dc.identifier.epage | 349 | - |
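The record only names the mix+GAN protocol; below is a hypothetical minimal sketch of the mixture idea the abstract describes: a small mixture of generators and discriminators with learnable mixing weights and an entropy regularizer. All sizes, architectures, and the helper `mixture_payoff` are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical minimal sketch of the mix+GAN idea: train a small mixture of
# generators and discriminators with learnable mixing weights, plus an
# entropy regularizer so the mixtures do not collapse onto a single pair.
# Architectures, sizes, and names are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

K, Z_DIM, X_DIM = 3, 16, 2  # mixture size, latent dim, data dim (assumed)

gens = nn.ModuleList(
    [nn.Sequential(nn.Linear(Z_DIM, 32), nn.ReLU(), nn.Linear(32, X_DIM))
     for _ in range(K)]
)
discs = nn.ModuleList(
    [nn.Sequential(nn.Linear(X_DIM, 32), nn.ReLU(), nn.Linear(32, 1))
     for _ in range(K)]
)
g_logits = torch.zeros(K, requires_grad=True)  # generator mixing weights (logits)
d_logits = torch.zeros(K, requires_grad=True)  # discriminator mixing weights

def mixture_payoff(real, batch=64):
    """Expected GAN log-loss payoff over all (generator, discriminator)
    pairs, weighted by the two mixtures, plus the mixing-weight entropy."""
    g_w = torch.softmax(g_logits, dim=0)
    d_w = torch.softmax(d_logits, dim=0)
    payoff = real.new_zeros(())
    for i, G in enumerate(gens):
        fake = G(torch.randn(batch, Z_DIM))
        for j, D in enumerate(discs):
            # standard log-loss payoff for the (i, j) pair
            v = (F.logsigmoid(D(real)).mean()
                 + torch.log1p(-torch.sigmoid(D(fake))).mean())
            payoff = payoff + g_w[i] * d_w[j] * v
    entropy = -(g_w * g_w.log()).sum() - (d_w * d_w.log()).sum()
    return payoff, entropy

# One illustrative evaluation on a stand-in batch; in a real loop, the
# discriminator side (discs, d_logits) would ascend payoff + lambda * entropy
# while the generator side (gens, g_logits) descends it, with separate
# optimizers for the two sides.
real = torch.randn(64, X_DIM)
payoff, entropy = mixture_payoff(real)
```

The entropy bonus is what keeps the learned mixture from degenerating into an ordinary single-generator GAN; the mixture itself is what lets the protocol wrap around any existing GAN objective, as the abstract claims.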