Appears in Collections:
Conference Paper: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
Title | Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation |
---|---|
Authors | Zhao, J; Fang, Y; Li, G |
Keywords | Convolutional Neural Networks; ResNet; DenseNet; Recurrent Structures; Layer Aggregation |
Issue Date | 2021 |
Publisher | Neural Information Processing Systems Foundation, Inc. (https://papers.nips.cc/) |
Citation | 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al. (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings |
Abstract | This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very light-weighted module, called recurrent layer aggregation (RLA), by making use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by our extensive experiments on image |
Description | Poster Session 5 at Spot F0 in Virtual World |
Persistent Identifier | http://hdl.handle.net/10722/307994 |
ISSN | 1049-5258 |
SCImago Journal Rankings (2020) | 1.399 |
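The abstract describes recurrent layer aggregation (RLA) as a light-weight module that reuses information from previous layers by exploiting the sequential structure of depth, i.e., updating a small hidden state RNN-style as layer features arrive. The following is only an illustrative sketch of that idea, not the paper's implementation: the function name `rla_step`, the weight matrices, and the dimensions are all assumptions chosen for a minimal NumPy demonstration.

```python
import numpy as np

def rla_step(x_t, h_prev, W_x, W_h):
    # Illustrative recurrent-aggregation update: the hidden state h
    # summarizes features from all earlier layers and is refreshed
    # with the current layer's features, like an RNN unrolled along depth.
    return np.tanh(x_t @ W_x + h_prev @ W_h)

rng = np.random.default_rng(0)
d, k = 8, 4                      # per-layer feature dim, aggregation-state dim
W_x = rng.normal(size=(d, k)) * 0.1
W_h = rng.normal(size=(k, k)) * 0.1

h = np.zeros(k)                  # aggregation state before the first layer
for depth in range(5):           # stand-ins for five successive layers' outputs
    x_t = rng.normal(size=d)
    h = rla_step(x_t, h, W_x, W_h)

print(h.shape)
```

Because the state `h` keeps a fixed size regardless of depth, the per-layer cost stays constant, which is the contrast the abstract draws with DenseNet, whose aggregation grows with the number of preceding layers.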
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhao, J | - |
dc.contributor.author | Fang, Y | - |
dc.contributor.author | Li, G | - |
dc.date.accessioned | 2021-11-12T13:40:54Z | - |
dc.date.available | 2021-11-12T13:40:54Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings | - |
dc.identifier.issn | 1049-5258 | - |
dc.identifier.uri | http://hdl.handle.net/10722/307994 | - |
dc.description | Poster Session 5 at Spot F0 in Virtual World | - |
dc.description.abstract | This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very light-weighted module, called recurrent layer aggregation (RLA), by making use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by our extensive experiments on image | - |
dc.language | eng | - |
dc.publisher | Neural Information Processing Systems Foundation, Inc. The Journal's web site is located at https://papers.nips.cc/ | - |
dc.relation.ispartof | 35th Conference on Neural Information Processing Systems (NeurIPS), 2021 | - |
dc.relation.ispartof | Advances in Neural Information Processing Systems 34 (NIPS 2021 Proceedings) | - |
dc.subject | Convolutional Neural Networks | - |
dc.subject | ResNet | - |
dc.subject | DenseNet | - |
dc.subject | Recurrent Structures | - |
dc.subject | Layer Aggregation | - |
dc.title | Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Li, G: gdli@hku.hk | - |
dc.identifier.authority | Li, G=rp00738 | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.hkuros | 329473 | - |
dc.publisher.place | United States | - |