Article: Block Proposal Neural Architecture Search

Title: Block Proposal Neural Architecture Search
Authors: Liu, Jiaheng; Zhou, Shunfeng; Wu, Yichao; Chen, Ken; Ouyang, Wanli; Xu, Dong
Issue Date: 2021
Citation: IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society, 2021, v. 30, p. 15-25
Abstract: Existing neural architecture search (NAS) methods usually restrict the search space to pre-defined types of blocks within a fixed macro-architecture. However, this strategy limits the search space and reduces architecture flexibility when block proposal search (BPS) is not considered in NAS. As a result, block structure search is the bottleneck in many previous NAS works. In this work, we propose a new evolutionary algorithm, referred to as latency EvoNAS (LEvoNAS), for block structure search, and incorporate it into the NAS framework by developing a novel two-stage framework referred to as Block Proposal NAS (BP-NAS). Comprehensive experimental results on two computer vision tasks demonstrate the superiority of our approach over state-of-the-art lightweight methods. For the classification task on the ImageNet dataset, our BPN-A outperforms 1.0-MobileNetV2 with similar latency, and our BPN-B saves 23.7% latency compared with 1.4-MobileNetV2 while achieving higher top-1 accuracy. Furthermore, for the object detection task on the COCO dataset, our method achieves a significant performance improvement over MobileNetV2, which demonstrates the generalization capability of our framework.
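The abstract describes LEvoNAS as a latency-aware evolutionary algorithm that searches block structures under a latency constraint. As a rough illustration only, the Python sketch below shows a generic latency-constrained evolutionary search over per-layer block choices; the block names, latency table, and accuracy estimator are hypothetical stand-ins and do not reflect the paper's actual search space, supernet, or two-stage BP-NAS procedure.

# Minimal sketch of a latency-constrained evolutionary search (illustrative only).
# The accuracy and latency estimators are placeholders, not the paper's method.
import random

BLOCK_CHOICES = ["mbconv_k3_e3", "mbconv_k5_e3", "mbconv_k3_e6", "mbconv_k5_e6"]
NUM_LAYERS = 8          # hypothetical macro-architecture depth
LATENCY_BUDGET_MS = 60  # hypothetical target latency

def random_arch():
    """Sample a block proposal: one block choice per layer."""
    return [random.choice(BLOCK_CHOICES) for _ in range(NUM_LAYERS)]

def estimate_latency(arch):
    """Stand-in for a per-block latency lookup table (placeholder values, in ms)."""
    per_block = {"mbconv_k3_e3": 5.0, "mbconv_k5_e3": 6.5,
                 "mbconv_k3_e6": 8.0, "mbconv_k5_e6": 10.0}
    return sum(per_block[b] for b in arch)

def estimate_accuracy(arch):
    """Stand-in for an accuracy predictor (random proxy for illustration)."""
    return random.random()

def fitness(arch):
    """Reject architectures over the latency budget; otherwise rank by accuracy proxy."""
    if estimate_latency(arch) > LATENCY_BUDGET_MS:
        return float("-inf")
    return estimate_accuracy(arch)

def mutate(arch, prob=0.2):
    """Resample each layer's block choice with a small probability."""
    return [random.choice(BLOCK_CHOICES) if random.random() < prob else b
            for b in arch]

def evolve(pop_size=20, generations=10, parents=5):
    """Simple mutation-only evolutionary loop keeping the top architectures."""
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        top = population[:parents]
        # Refill the population by mutating the best architectures found so far.
        population = top + [mutate(random.choice(top))
                            for _ in range(pop_size - parents)]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best blocks:", best)
    print("estimated latency (ms):", estimate_latency(best))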
Persistent Identifier: http://hdl.handle.net/10722/321911
ISI Accession Number ID: WOS:000591830600002

 

DC Field: Value
dc.contributor.author: Liu, Jiaheng
dc.contributor.author: Zhou, Shunfeng
dc.contributor.author: Wu, Yichao
dc.contributor.author: Chen, Ken
dc.contributor.author: Ouyang, Wanli
dc.contributor.author: Xu, Dong
dc.date.accessioned: 2022-11-03T02:22:17Z
dc.date.available: 2022-11-03T02:22:17Z
dc.date.issued: 2021
dc.identifier.citation: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2021, v. 30, p. 15-25
dc.identifier.uri: http://hdl.handle.net/10722/321911
dc.description.abstract: Existing neural architecture search (NAS) methods usually restrict the search space to pre-defined types of blocks within a fixed macro-architecture. However, this strategy limits the search space and reduces architecture flexibility when block proposal search (BPS) is not considered in NAS. As a result, block structure search is the bottleneck in many previous NAS works. In this work, we propose a new evolutionary algorithm, referred to as latency EvoNAS (LEvoNAS), for block structure search, and incorporate it into the NAS framework by developing a novel two-stage framework referred to as Block Proposal NAS (BP-NAS). Comprehensive experimental results on two computer vision tasks demonstrate the superiority of our approach over state-of-the-art lightweight methods. For the classification task on the ImageNet dataset, our BPN-A outperforms 1.0-MobileNetV2 with similar latency, and our BPN-B saves 23.7% latency compared with 1.4-MobileNetV2 while achieving higher top-1 accuracy. Furthermore, for the object detection task on the COCO dataset, our method achieves a significant performance improvement over MobileNetV2, which demonstrates the generalization capability of our framework.
dc.language: eng
dc.relation.ispartof: IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
dc.title: Block Proposal Neural Architecture Search
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TIP.2020.3028288
dc.identifier.pmid: 33035163
dc.identifier.scopus: eid_2-s2.0-85096457015
dc.identifier.volume: 30
dc.identifier.spage: 15
dc.identifier.epage: 25
dc.identifier.eissn: 1941-0042
dc.identifier.isi: WOS:000591830600002
