Conference Paper: Channel equilibrium networks for learning deep representation

Title: Channel equilibrium networks for learning deep representation
Authors: Shao, W; Tang, S; Pan, X; Tan, P; Wang, X; Luo, P
Issue Date: 2020
Publisher: ML Research Press. The journal's web site is located at http://proceedings.mlr.press/
Citation: Thirty-seventh International Conference on Machine Learning (ICML 2020), Vienna, Austria, 12-18 July 2020. In Proceedings of Machine Learning Research (PMLR), v. 119: Proceedings of ICML 2020, p. 8645-8654
Abstract: Convolutional Neural Networks (CNNs) are typically constructed by stacking multiple building blocks, each of which contains a normalization layer such as batch normalization (BN) and a rectified linear function such as ReLU. However, this work shows that the combination of normalization and rectified linear function leads to inhibited channels, which have small magnitude and contribute little to the learned feature representation, impeding the generalization ability of CNNs. Unlike prior art that simply removed the inhibited channels, we propose to "wake them up" during training by designing a novel neural building block, termed the Channel Equilibrium (CE) block, which enables channels at the same layer to contribute equally to the learned representation. We show that CE is able to prevent inhibited channels both empirically and theoretically. CE has several appealing benefits. (1) It can be integrated into many advanced CNN architectures such as ResNet and MobileNet, outperforming their original networks. (2) CE has an interesting connection with the Nash Equilibrium, a well-known solution of a non-cooperative game. (3) Extensive experiments show that CE achieves state-of-the-art performance on various challenging benchmarks such as ImageNet and COCO. (A code sketch illustrating the inhibited-channel effect follows this record.)
Description: ICML 2020 was held virtually due to COVID-19.
Persistent Identifier: http://hdl.handle.net/10722/284167
ISSN: 2640-3498
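
As a rough illustration of the inhibited-channel effect described in the abstract, the sketch below builds a standard Conv-BN-ReLU block and measures the mean activation magnitude of each channel; channels whose magnitude falls far below the layer average contribute little to the learned representation. This is a minimal PyTorch sketch, not the paper's code: the block shape, the dummy input, and the 10%-of-average threshold are illustrative assumptions, and in a freshly initialized network few channels will be inhibited (the effect emerges during training).

    # Minimal sketch (not the paper's code): measure per-channel activation
    # magnitude after a Conv -> BN -> ReLU block. The 10%-of-average
    # threshold is an arbitrary illustrative choice.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # A typical CNN building block: convolution, batch normalization, ReLU.
    block = nn.Sequential(
        nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(64),
        nn.ReLU(),
    )

    x = torch.randn(32, 3, 56, 56)            # dummy input batch
    with torch.no_grad():
        y = block(x)                          # shape (32, 64, 56, 56)

    # Mean |activation| per channel, averaged over batch and spatial dims.
    channel_mag = y.abs().mean(dim=(0, 2, 3))

    # Channels far below the layer average are candidate inhibited channels.
    inhibited = (channel_mag < 0.1 * channel_mag.mean()).sum().item()
    print(f"{inhibited} of {channel_mag.numel()} channels look inhibited")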

DC Field: Value
dc.contributor.author: Shao, W
dc.contributor.author: Tang, S
dc.contributor.author: Pan, X
dc.contributor.author: Tan, P
dc.contributor.author: Wang, X
dc.contributor.author: Luo, P
dc.date.accessioned: 2020-07-20T05:56:37Z
dc.date.available: 2020-07-20T05:56:37Z
dc.date.issued: 2020
dc.identifier.citation: Thirty-seventh International Conference on Machine Learning (ICML 2020), Vienna, Austria, 12-18 July 2020. In Proceedings of Machine Learning Research (PMLR), v. 119: Proceedings of ICML 2020, p. 8645-8654
dc.identifier.issn: 2640-3498
dc.identifier.uri: http://hdl.handle.net/10722/284167
dc.description: ICML 2020 was held virtually due to COVID-19
dc.description.abstract: Convolutional Neural Networks (CNNs) are typically constructed by stacking multiple building blocks, each of which contains a normalization layer such as batch normalization (BN) and a rectified linear function such as ReLU. However, this work shows that the combination of normalization and rectified linear function leads to inhibited channels, which have small magnitude and contribute little to the learned feature representation, impeding the generalization ability of CNNs. Unlike prior art that simply removed the inhibited channels, we propose to "wake them up" during training by designing a novel neural building block, termed the Channel Equilibrium (CE) block, which enables channels at the same layer to contribute equally to the learned representation. We show that CE is able to prevent inhibited channels both empirically and theoretically. CE has several appealing benefits. (1) It can be integrated into many advanced CNN architectures such as ResNet and MobileNet, outperforming their original networks. (2) CE has an interesting connection with the Nash Equilibrium, a well-known solution of a non-cooperative game. (3) Extensive experiments show that CE achieves state-of-the-art performance on various challenging benchmarks such as ImageNet and COCO.
dc.language: eng
dc.publisher: ML Research Press. The journal's web site is located at http://proceedings.mlr.press/
dc.relation.ispartof: Proceedings of Machine Learning Research (PMLR)
dc.relation.ispartof: The 37th International Conference on Machine Learning (ICML 2020)
dc.title: Channel equilibrium networks for learning deep representation
dc.type: Conference_Paper
dc.identifier.email: Luo, P: pluo@hku.hk
dc.identifier.authority: Luo, P=rp02575
dc.identifier.hkuros: 311028
dc.identifier.volume: 119: Proceedings of ICML 2020
dc.identifier.spage: 8645
dc.identifier.epage: 8654
dc.publisher.place: United States
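
The abstract states that CE "can be integrated into many advanced CNN architectures such as ResNet and MobileNet". The hypothetical sketch below only marks where such a block could slot into a ResNet-style basic block; the CE block's actual computation is defined in the paper and is not reproduced here, so CEPlaceholder is an identity stand-in, and placing it after each normalization layer is an assumption made for illustration.

    # Hypothetical sketch: a ResNet-style basic block with an identity
    # placeholder marking a possible insertion point for a CE-like block.
    # The real CE computation is described in the paper, not here.
    import torch
    import torch.nn as nn

    class CEPlaceholder(nn.Module):
        """Identity stand-in for the Channel Equilibrium (CE) block."""
        def forward(self, x):
            return x

    class BasicBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn1 = nn.BatchNorm2d(channels)
            self.ce1 = CEPlaceholder()        # assumed insertion point: after BN
            self.relu = nn.ReLU()
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn2 = nn.BatchNorm2d(channels)
            self.ce2 = CEPlaceholder()

        def forward(self, x):
            out = self.relu(self.ce1(self.bn1(self.conv1(x))))
            out = self.ce2(self.bn2(self.conv2(out)))
            return self.relu(out + x)         # residual connection

    # Usage: apply a 64-channel block to a dummy feature map.
    y = BasicBlock(64)(torch.randn(1, 64, 56, 56))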
