Article: On the capacity of deep generative networks for approximating distributions

Title: On the capacity of deep generative networks for approximating distributions
Authors: Yang, Yunfei; Li, Zhen; Wang, Yang
Keywords: Approximation complexity; Deep ReLU networks; Generative adversarial networks; Maximum mean discrepancy; Wasserstein distance
Issue Date: 2022
Citation: Neural Networks, 2022, v. 145, p. 144-154
Abstract: We study the efficacy and efficiency of deep generative networks for approximating probability distributions. We prove that neural networks can transform a low-dimensional source distribution into a distribution that is arbitrarily close to a high-dimensional target distribution, when closeness is measured by Wasserstein distances and maximum mean discrepancy. Upper bounds on the approximation error are obtained in terms of the width and depth of the neural network. Furthermore, it is shown that the approximation error in Wasserstein distance grows at most linearly in the ambient dimension, and that the approximation order depends only on the intrinsic dimension of the target distribution. In contrast, when f-divergences are used as metrics between distributions, the approximation behavior differs: to approximate the target distribution in f-divergences, the dimension of the source distribution cannot be smaller than the intrinsic dimension of the target distribution.
(See the sketches following this record for the metric definitions and a toy pushforward example.)
Persistent Identifier: http://hdl.handle.net/10722/363423
ISSN: 0893-6080
2023 Impact Factor: 6.0
2023 SCImago Journal Rankings: 2.605
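
For reference, the two metrics named in the abstract are standard; for probability measures \mu and \nu on R^d,

    W_p(\mu, \nu) = \left( \inf_{\gamma \in \Pi(\mu, \nu)} \int \|x - y\|^p \, d\gamma(x, y) \right)^{1/p},

    \mathrm{MMD}(\mu, \nu) = \sup_{\|f\|_{\mathcal{H}} \le 1} \left( \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)] \right),

where \Pi(\mu, \nu) is the set of couplings of \mu and \nu and \mathcal{H} is a reproducing kernel Hilbert space. These definitions are supplied here for context; they are not quoted from the paper.

The sketch below illustrates the setting the abstract describes: samples from a one-dimensional source distribution are pushed through a small ReLU network into R^2, and the resulting pushforward is compared with a target sample by an unbiased Gaussian-kernel estimate of squared MMD. The network is untrained (random weights), and all widths, distributions, and function names here are illustrative assumptions, not the paper's construction.

    # Toy pushforward sketch: 1-D source -> ReLU network -> R^2,
    # compared with a 2-D Gaussian target via a squared-MMD estimate.
    import numpy as np

    rng = np.random.default_rng(0)

    def relu_network(z, widths=(1, 32, 32, 2)):
        """Forward pass of an untrained fully connected ReLU network."""
        x = z
        n_layers = len(widths) - 1
        for i in range(n_layers):
            W = rng.normal(scale=1.0 / np.sqrt(widths[i]),
                           size=(widths[i], widths[i + 1]))
            b = rng.normal(size=widths[i + 1])
            x = x @ W + b
            if i < n_layers - 1:          # ReLU on hidden layers only
                x = np.maximum(x, 0.0)
        return x

    def gaussian_kernel(x, y, sigma=1.0):
        """Kernel matrix k(x_i, y_j) = exp(-|x_i - y_j|^2 / (2 sigma^2))."""
        d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def mmd2_unbiased(x, y, sigma=1.0):
        """Unbiased estimate of squared MMD between sample sets x and y."""
        m, n = len(x), len(y)
        kxx = gaussian_kernel(x, x, sigma)
        kyy = gaussian_kernel(y, y, sigma)
        kxy = gaussian_kernel(x, y, sigma)
        term_x = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
        term_y = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
        return term_x + term_y - 2.0 * kxy.mean()

    n = 500
    z = rng.uniform(-1.0, 1.0, size=(n, 1))   # low-dimensional source samples
    gen = relu_network(z)                     # pushforward samples in R^2
    target = rng.normal(size=(n, 2))          # higher-dimensional target
    print("squared MMD estimate:", mmd2_unbiased(gen, target))

The kernel bandwidth sigma strongly affects the estimate; a common heuristic is to set it to the median pairwise distance of the pooled samples.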

 

DC Field                  Value
dc.contributor.author     Yang, Yunfei
dc.contributor.author     Li, Zhen
dc.contributor.author     Wang, Yang
dc.date.accessioned       2025-10-10T07:46:45Z
dc.date.available         2025-10-10T07:46:45Z
dc.date.issued            2022
dc.identifier.citation    Neural Networks, 2022, v. 145, p. 144-154
dc.identifier.issn        0893-6080
dc.identifier.uri         http://hdl.handle.net/10722/363423
dc.language               eng
dc.relation.ispartof      Neural Networks
dc.subject                Approximation complexity
dc.subject                Deep ReLU networks
dc.subject                Generative adversarial networks
dc.subject                Maximum mean discrepancy
dc.subject                Wasserstein distance
dc.title                  On the capacity of deep generative networks for approximating distributions
dc.type                   Article
dc.description.nature     link_to_subscribed_fulltext
dc.identifier.doi         10.1016/j.neunet.2021.10.012
dc.identifier.pmid        34749027
dc.identifier.scopus      eid_2-s2.0-85118487937
dc.identifier.volume      145
dc.identifier.spage       144
dc.identifier.epage       154
dc.identifier.eissn       1879-2782
