File Download: There are no files associated with this item.
Citations: Scopus: 0
Appears in Collections: Conference Paper

Conference Paper: Norm-Based Generalisation Bounds for Deep Multi-Class Convolutional Neural Networks
Field | Value
---|---
Title | Norm-Based Generalisation Bounds for Deep Multi-Class Convolutional Neural Networks
Authors | Ledent, Antoine; Mustafa, Waleed; Lei, Yunwen; Kloft, Marius
Issue Date | 2021
Citation | 35th AAAI Conference on Artificial Intelligence, AAAI 2021, 2021, v. 9B, p. 8279-8287
Abstract | We show generalisation error bounds for deep learning with two main improvements over the state of the art. (1) Our bounds have no explicit dependence on the number of classes except for logarithmic factors. This holds even when formulating the bounds in terms of the Frobenius-norm of the weight matrices, where previous bounds exhibit at least a square-root dependence on the number of classes. (2) We adapt the classic Rademacher analysis of DNNs to incorporate weight sharing—a task of fundamental theoretical importance which was previously attempted only under very restrictive assumptions. In our results, each convolutional filter contributes only once to the bound, regardless of how many times it is applied. Further improvements exploiting pooling and sparse connections are provided. The presented bounds scale as the norms of the parameter matrices, rather than the number of parameters. In particular, contrary to bounds based on parameter counting, they are asymptotically tight (up to log factors) when the weights approach initialisation, making them suitable as a basic ingredient in bounds sensitive to the optimisation procedure. We also show how to adapt the recent technique of loss function augmentation to replace spectral norms by empirical analogues whilst maintaining the advantages of our approach.
Persistent Identifier | http://hdl.handle.net/10722/329780
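The abstract's claims follow the general template of margin-based Rademacher analyses. As a hedged sketch (this is the shape common to this literature, not the paper's exact statement), with weight matrices \(A_l\), reference (initialisation) matrices \(M_l\), margin \(\gamma\), depth \(L\), and sample size \(n\), such bounds typically read:

```latex
% Schematic margin bound (illustrative only): misclassification risk is
% controlled by the empirical margin loss plus a norm-based capacity term.
\[
  \Pr\Bigl\{\operatorname*{argmax}_{y} f(x)_y \neq y\Bigr\}
  \;\le\;
  \widehat{R}_{\gamma}(f)
  \;+\;
  \widetilde{O}\!\left(
    \frac{\displaystyle\prod_{l=1}^{L}\|A_l\|_{\sigma}
          \left(\sum_{l=1}^{L}
            \frac{\|A_l - M_l\|_{F}^{2/3}}{\|A_l\|_{\sigma}^{2/3}}
          \right)^{3/2}}
         {\gamma\sqrt{n}}
  \right)
\]
```

The distance-to-initialisation terms \(\|A_l - M_l\|_F\) vanish as the weights approach initialisation, which is why such bounds can be asymptotically tight in that regime, as the abstract notes.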
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ledent, Antoine | - |
dc.contributor.author | Mustafa, Waleed | - |
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Kloft, Marius | - |
dc.date.accessioned | 2023-08-09T03:35:17Z | - |
dc.date.available | 2023-08-09T03:35:17Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 35th AAAI Conference on Artificial Intelligence, AAAI 2021, 2021, v. 9B, p. 8279-8287 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329780 | - |
dc.description.abstract | We show generalisation error bounds for deep learning with two main improvements over the state of the art. (1) Our bounds have no explicit dependence on the number of classes except for logarithmic factors. This holds even when formulating the bounds in terms of the Frobenius-norm of the weight matrices, where previous bounds exhibit at least a square-root dependence on the number of classes. (2) We adapt the classic Rademacher analysis of DNNs to incorporate weight sharing—a task of fundamental theoretical importance which was previously attempted only under very restrictive assumptions. In our results, each convolutional filter contributes only once to the bound, regardless of how many times it is applied. Further improvements exploiting pooling and sparse connections are provided. The presented bounds scale as the norms of the parameter matrices, rather than the number of parameters. In particular, contrary to bounds based on parameter counting, they are asymptotically tight (up to log factors) when the weights approach initialisation, making them suitable as a basic ingredient in bounds sensitive to the optimisation procedure. We also show how to adapt the recent technique of loss function augmentation to replace spectral norms by empirical analogues whilst maintaining the advantages of our approach. | - |
dc.language | eng | - |
dc.relation.ispartof | 35th AAAI Conference on Artificial Intelligence, AAAI 2021 | - |
dc.title | Norm-Based Generalisation Bounds for Deep Multi-Class Convolutional Neural Networks | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85125029441 | - |
dc.identifier.volume | 9B | - |
dc.identifier.spage | 8279 | - |
dc.identifier.epage | 8287 | - |
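The abstract's central points — capacity terms built from matrix norms rather than parameter counts, and each convolutional filter contributing only once regardless of how many positions it is applied at — can be sketched numerically. The layer shapes, variable names, and the particular capacity formula below are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense-layer weight matrices of a small hypothetical 3-layer network.
weights = [
    rng.standard_normal((64, 32)),
    rng.standard_normal((32, 16)),
    rng.standard_normal((16, 10)),
]

spectral = [np.linalg.norm(W, 2) for W in weights]    # largest singular values
frob = [np.linalg.norm(W, "fro") for W in weights]    # Frobenius norms

# One common shape for a norm-based capacity term: a product of spectral
# norms times a sum of per-layer Frobenius/spectral ratios. This depends on
# the norms of the weights, not on the parameter count.
capacity = float(np.prod(spectral)) * sum(f / s for f, s in zip(frob, spectral))

# Weight sharing: a convolutional filter bank applied at many spatial
# positions contributes its Frobenius norm only once in bounds of this kind,
# rather than once per position.
K = rng.standard_normal((3, 3, 16, 32))  # hypothetical 3x3 filter bank
filter_norm = float(np.linalg.norm(K))   # counted once, independent of positions
```

Note that the spectral norm of each matrix never exceeds its Frobenius norm, which is why Frobenius-norm formulations are the looser but simpler variant the abstract contrasts against.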