
Conference Paper: CycleMLP: A MLP-like architecture for dense prediction

Title: CycleMLP: A MLP-like architecture for dense prediction
Authors: Chen, S; Xie, E; Ge, C; Liang, DY; Luo, P
Issue Date: 2022
Publisher: IEEE.
Citation: 10th International Conference on Learning Representations (ICLR) (Virtual), 25-29 April, 2022
Abstract: This paper presents a simple MLP-like architecture, CycleMLP, which is a versatile backbone for visual recognition and dense prediction. Compared to modern MLP architectures, e.g., MLP-Mixer (Tolstikhin et al., 2021), ResMLP (Touvron et al., 2021a), and gMLP (Liu et al., 2021a), whose architectures are correlated to image size and thus are infeasible for object detection and segmentation, CycleMLP has two advantages. (1) It can cope with various image sizes. (2) It achieves computational complexity linear in image size by using local windows. In contrast, previous MLPs have O(N²) computations due to their fully spatial connections. We build a family of models that surpass existing MLPs and even state-of-the-art Transformer-based models, e.g., Swin Transformer (Liu et al., 2021b), while using fewer parameters and FLOPs. We expand the applicability of MLP-like models, making them a versatile backbone for dense prediction tasks. CycleMLP achieves competitive results on object detection, instance segmentation, and semantic segmentation. In particular, CycleMLP-Tiny outperforms Swin-Tiny by 1.3% mIoU on the ADE20K dataset with fewer FLOPs. Moreover, CycleMLP also shows excellent zero-shot robustness on the ImageNet-C dataset.
Description: Oral presentation
Persistent Identifier: http://hdl.handle.net/10722/315792
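The abstract contrasts the O(N²) cost of fully spatial MLP mixing with the linear cost of window-local mixing. The sketch below is an illustrative toy (not the authors' Cycle FC operator or code): it compares a fully-connected spatial mix, whose N×N weight matrix is tied to a fixed token count, with a k-tap local mix whose cost grows only linearly with N and whose weights are reusable across image sizes. Function names and the circular-border handling are assumptions for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def full_spatial_mix(x, w):
    """Fully-connected spatial MLP: every token mixes with every token.
    x: (N, C) tokens; w: (N, N) weights -> O(N^2 * C) multiply-adds,
    and w is only valid for this one value of N (one image size)."""
    return w @ x

def local_window_mix(x, w, k=3):
    """Window-local mixing: each token combines only its k neighbours
    (circular at the borders, for simplicity), so the cost is
    O(N * k * C) and the k weights are independent of N."""
    out = np.zeros_like(x)
    for offset in range(-(k // 2), k // 2 + 1):
        out += w[offset + k // 2] * np.roll(x, offset, axis=0)
    return out

N, C, k = 16, 4, 3
x = rng.standard_normal((N, C))
w_full = rng.standard_normal((N, N))   # N*N spatial weights, fixed to N
w_local = rng.standard_normal(k)       # only k weights, any N

y_full = full_spatial_mix(x, w_full)
y_local = local_window_mix(x, w_local, k)

# The local variant runs unchanged on a different "image size",
# which the fully-connected w_full cannot do.
y_other = local_window_mix(rng.standard_normal((32, C)), w_local, k)
print(y_full.shape, y_local.shape, y_other.shape)
```

This is the structural reason the abstract gives for why earlier MLPs are infeasible as dense-prediction backbones while window-local designs are not.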


DC Field | Value | Language
dc.contributor.author | Chen, S | -
dc.contributor.author | Xie, E | -
dc.contributor.author | Ge, C | -
dc.contributor.author | Liang, DY | -
dc.contributor.author | Luo, P | -
dc.date.accessioned | 2022-08-19T09:04:31Z | -
dc.date.available | 2022-08-19T09:04:31Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | 10th International Conference on Learning Representations (ICLR) (Virtual), 25-29 April, 2022 | -
dc.identifier.uri | http://hdl.handle.net/10722/315792 | -
dc.description | Oral presentation | -
dc.description.abstract | This paper presents a simple MLP-like architecture, CycleMLP, which is a versatile backbone for visual recognition and dense prediction. Compared to modern MLP architectures, e.g., MLP-Mixer (Tolstikhin et al., 2021), ResMLP (Touvron et al., 2021a), and gMLP (Liu et al., 2021a), whose architectures are correlated to image size and thus are infeasible for object detection and segmentation, CycleMLP has two advantages. (1) It can cope with various image sizes. (2) It achieves computational complexity linear in image size by using local windows. In contrast, previous MLPs have O(N²) computations due to their fully spatial connections. We build a family of models that surpass existing MLPs and even state-of-the-art Transformer-based models, e.g., Swin Transformer (Liu et al., 2021b), while using fewer parameters and FLOPs. We expand the applicability of MLP-like models, making them a versatile backbone for dense prediction tasks. CycleMLP achieves competitive results on object detection, instance segmentation, and semantic segmentation. In particular, CycleMLP-Tiny outperforms Swin-Tiny by 1.3% mIoU on the ADE20K dataset with fewer FLOPs. Moreover, CycleMLP also shows excellent zero-shot robustness on the ImageNet-C dataset. | -
dc.language | eng | -
dc.publisher | IEEE. | -
dc.relation.ispartof | International Conference on Learning Representations (ICLR), Oral | -
dc.rights | Copyright © IEEE. | -
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | -
dc.title | CycleMLP: A MLP-like architecture for dense prediction | -
dc.type | Conference_Paper | -
dc.identifier.email | Luo, P: pluo@hku.hk | -
dc.identifier.authority | Luo, P=rp02575 | -
dc.identifier.hkuros | 335562 | -
dc.publisher.place | United States | -
