Links for fulltext (may require subscription):
- Publisher Website: https://doi.org/10.1016/j.acha.2023.03.004
- Scopus: eid_2-s2.0-85151018864
Citations:
- Scopus: 0
Article: Approximation bounds for norm constrained neural networks with applications to regression and GANs
| Title | Approximation bounds for norm constrained neural networks with applications to regression and GANs |
|---|---|
| Authors | Jiao, Yuling; Wang, Yang; Yang, Yunfei |
| Keywords | Approximation theory; Deep learning; GAN; Neural network |
| Issue Date | 2023 |
| Citation | Applied and Computational Harmonic Analysis, 2023, v. 65, p. 249-278 |
| Abstract | This paper studies the approximation capacity of ReLU neural networks with a norm constraint on the weights. We prove upper and lower bounds on the approximation error of these networks for smooth function classes. The lower bound is derived through the Rademacher complexity of neural networks, which may be of independent interest. We apply these approximation bounds to analyze the convergence of regression using norm-constrained neural networks and of distribution estimation by GANs. In particular, we obtain convergence rates for over-parameterized neural networks. It is also shown that GANs can achieve the optimal rate of learning probability distributions when the discriminator is a properly chosen norm-constrained neural network. |
| Persistent Identifier | http://hdl.handle.net/10722/363525 |
| ISSN | 1063-5203 (2023 Impact Factor: 2.6; 2023 SCImago Journal Rankings: 2.231) |
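The abstract above concerns ReLU networks whose weights satisfy a norm constraint. As an illustrative aside (not code from the paper, and with an arbitrarily chosen architecture), the sketch below shows the standard fact that underlies such constraints: for a ReLU network, the product of the operator norms of the weight matrices upper-bounds the network's Lipschitz constant, since ReLU itself is 1-Lipschitz.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer ReLU network f(x) = W2 @ relu(W1 @ x),
# with randomly drawn weights purely for illustration.
W1 = rng.normal(size=(8, 3))
W2 = rng.normal(size=(1, 8))

def relu(z):
    return np.maximum(z, 0.0)

def f(x):
    return W2 @ relu(W1 @ x)

# "Norm constraint" quantity: the product of spectral norms of the
# weight matrices. Because ReLU is 1-Lipschitz, this product bounds
# the Lipschitz constant of f.
norm_bound = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)

# Empirical sanity check on random pairs:
# |f(x) - f(y)| <= norm_bound * |x - y|.
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    lhs = np.linalg.norm(f(x) - f(y))
    rhs = norm_bound * np.linalg.norm(x - y)
    assert lhs <= rhs + 1e-9
print("Lipschitz bound from weight norms:", float(norm_bound))
```

Constraining this product (or related quantities such as the path norm) controls the capacity of the network class, which is what makes the Rademacher-complexity arguments mentioned in the abstract possible.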
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Jiao, Yuling | - |
| dc.contributor.author | Wang, Yang | - |
| dc.contributor.author | Yang, Yunfei | - |
| dc.date.accessioned | 2025-10-10T07:47:33Z | - |
| dc.date.available | 2025-10-10T07:47:33Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.citation | Applied and Computational Harmonic Analysis, 2023, v. 65, p. 249-278 | - |
| dc.identifier.issn | 1063-5203 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/363525 | - |
| dc.description.abstract | This paper studies the approximation capacity of ReLU neural networks with a norm constraint on the weights. We prove upper and lower bounds on the approximation error of these networks for smooth function classes. The lower bound is derived through the Rademacher complexity of neural networks, which may be of independent interest. We apply these approximation bounds to analyze the convergence of regression using norm-constrained neural networks and of distribution estimation by GANs. In particular, we obtain convergence rates for over-parameterized neural networks. It is also shown that GANs can achieve the optimal rate of learning probability distributions when the discriminator is a properly chosen norm-constrained neural network. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Applied and Computational Harmonic Analysis | - |
| dc.subject | Approximation theory | - |
| dc.subject | Deep learning | - |
| dc.subject | GAN | - |
| dc.subject | Neural network | - |
| dc.title | Approximation bounds for norm constrained neural networks with applications to regression and GANs | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1016/j.acha.2023.03.004 | - |
| dc.identifier.scopus | eid_2-s2.0-85151018864 | - |
| dc.identifier.volume | 65 | - |
| dc.identifier.spage | 249 | - |
| dc.identifier.epage | 278 | - |
| dc.identifier.eissn | 1096-603X | - |
