Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1007/978-3-030-01225-0_29
- Scopus: eid_2-s2.0-85055425029
- WOS: WOS:000594212900029
Conference Paper: Two at Once: Enhancing Learning and Generalization Capacities via IBN-Net
Title | Two at Once: Enhancing Learning and Generalization Capacities via IBN-Net |
---|---|
Authors | Pan, Xingang; Luo, Ping; Shi, Jianping; Tang, Xiaoou |
Keywords | Generalization; CNNs; Invariance; Instance normalization |
Issue Date | 2018 |
Citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2018, v. 11208 LNCS, p. 484-500 |
Abstract | © 2018, Springer Nature Switzerland AG. Convolutional neural networks (CNNs) have achieved great successes in many computer vision problems. Unlike existing works that designed CNN architectures to improve performance on a single task of a single domain and not generalizable, we present IBN-Net, a novel convolutional architecture, which remarkably enhances a CNN’s modeling ability on one domain (e.g. Cityscapes) as well as its generalization capacity on another domain (e.g. GTA5) without finetuning. IBN-Net carefully integrates Instance Normalization (IN) and Batch Normalization (BN) as building blocks, and can be wrapped into many advanced deep networks to improve their performances. This work has three key contributions. (1) By delving into IN and BN, we disclose that IN learns features that are invariant to appearance changes, such as colors, styles, and virtuality/reality, while BN is essential for preserving content related information. (2) IBN-Net can be applied to many advanced deep architectures, such as DenseNet, ResNet, ResNeXt, and SENet, and consistently improve their performance without increasing computational cost. (3) When applying the trained networks to new domains, e.g. from GTA5 to Cityscapes, IBN-Net achieves comparable improvements as domain adaptation methods, even without using data from the target domain. With IBN-Net, we won the 1st place on the WAD 2018 Challenge Drivable Area track, with an mIoU of 86.18%. |
Persistent Identifier | http://hdl.handle.net/10722/273643 |
ISSN | 0302-9743 (2023 SCImago Journal Rankings: 0.606) |
ISI Accession Number ID | WOS:000594212900029 |
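The abstract describes IBN-Net's core building block: Instance Normalization (IN) on part of a layer's channels to gain invariance to appearance changes, and Batch Normalization (BN) on the rest to preserve content information. The following is a minimal NumPy sketch of that channel-split idea, not the authors' implementation; the function names, the 50/50 split ratio, and the omission of learnable affine parameters are illustrative assumptions.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # BN: statistics computed per channel, over batch and spatial dims (N, H, W)
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-5):
    # IN: statistics computed per sample and per channel, over spatial dims (H, W)
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def ibn(x, ratio=0.5):
    # IBN block sketch for an NCHW tensor: IN on the first `ratio` fraction
    # of channels, BN on the remainder, then concatenate along channels.
    half = int(x.shape[1] * ratio)
    return np.concatenate([instance_norm(x[:, :half]),
                           batch_norm(x[:, half:])], axis=1)
```

Because IN discards per-image style statistics while BN keeps batch-level content statistics, splitting the channels this way lets one block serve both goals, which is the "two at once" of the title.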
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Pan, Xingang | - |
dc.contributor.author | Luo, Ping | - |
dc.contributor.author | Shi, Jianping | - |
dc.contributor.author | Tang, Xiaoou | - |
dc.date.accessioned | 2019-08-12T09:56:14Z | - |
dc.date.available | 2019-08-12T09:56:14Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2018, v. 11208 LNCS, p. 484-500 | - |
dc.identifier.issn | 0302-9743 | - |
dc.identifier.uri | http://hdl.handle.net/10722/273643 | - |
dc.language | eng | - |
dc.relation.ispartof | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | - |
dc.subject | Generalization | - |
dc.subject | CNNs | - |
dc.subject | Invariance | - |
dc.subject | Instance normalization | - |
dc.title | Two at Once: Enhancing Learning and Generalization Capacities via IBN-Net | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1007/978-3-030-01225-0_29 | - |
dc.identifier.scopus | eid_2-s2.0-85055425029 | - |
dc.identifier.volume | 11208 LNCS | - |
dc.identifier.spage | 484 | - |
dc.identifier.epage | 500 | - |
dc.identifier.eissn | 1611-3349 | - |
dc.identifier.isi | WOS:000594212900029 | - |
dc.identifier.issnl | 0302-9743 | - |