Conference Paper: Fashion retrieval via graph reasoning networks on a similarity pyramid

Title: Fashion retrieval via graph reasoning networks on a similarity pyramid
Authors: Kuang, Z; Gao, Y; Li, G; Luo, P; Chen, Y; Lin, L; Zhang, W
Issue Date: 2019
Publisher: Institute of Electrical and Electronics Engineers. The publisher's site for these proceedings is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000149
Citation: Proceedings of IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October - 2 November 2019, p. 3066-3075
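A BibTeX entry assembled from the fields in this record might look like the following (the citation key is arbitrary, and author names use the initials given in the record):

```bibtex
@inproceedings{kuang2019fashion,
  author    = {Kuang, Z. and Gao, Y. and Li, G. and Luo, P. and Chen, Y. and Lin, L. and Zhang, W.},
  title     = {Fashion Retrieval via Graph Reasoning Networks on a Similarity Pyramid},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year      = {2019},
  pages     = {3066--3075},
  doi       = {10.1109/ICCV.2019.00316},
}
```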
Abstract: Matching clothing images from customers and online shopping stores has rich applications in e-commerce. Existing algorithms encode an image as a global feature vector and perform retrieval with this global representation. However, discriminative local information on clothes is submerged in the global representation, resulting in sub-optimal performance. To address this issue, we propose a novel Graph Reasoning Network (GRNet) on a Similarity Pyramid, which learns similarities between a query and a gallery garment using both global and local representations at multiple scales. The similarity pyramid is represented by a similarity graph, where nodes represent similarities between clothing components at different scales, and the final matching score is obtained by message passing along edges. In GRNet, graph reasoning is solved by training a graph convolutional network, which aligns salient clothing components to improve clothing retrieval. To facilitate future research, we introduce a new benchmark, FindFashion, containing rich annotations of bounding boxes, views, occlusions, and cropping. Extensive experiments show that GRNet obtains new state-of-the-art results on two challenging benchmarks, e.g. pushing the top-1, top-20, and top-50 accuracies on DeepFashion to 26%, 64%, and 75% (i.e. 4%, 10%, and 10% absolute improvements), outperforming competitors by large margins. On FindFashion, GRNet achieves considerable improvements in all empirical settings.
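The message-passing idea in the abstract can be illustrated with a toy sketch: nodes hold similarity features between query and gallery clothing components, and a graph-convolution step mixes information between them before pooling into one matching score. This is a minimal illustration of the general technique, not the paper's actual architecture; the graph layout, feature dimensions, and weights below are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy similarity graph: 6 nodes, each holding an 8-dim similarity feature
# between a query region and a gallery region at some scale (illustrative).
num_nodes, feat_dim = 6, 8
X = rng.standard_normal((num_nodes, feat_dim))        # node features
A = np.ones((num_nodes, num_nodes)) - np.eye(num_nodes)  # fully connected graph

def gcn_layer(X, A, W):
    """One graph-convolution step: symmetric-normalized neighbor
    aggregation, a linear map, then ReLU."""
    A_hat = A + np.eye(len(A))                 # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))   # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

W1 = rng.standard_normal((feat_dim, feat_dim)) * 0.1   # layer weights (random here)
W2 = rng.standard_normal((feat_dim, 1)) * 0.1

H = gcn_layer(X, A, W1)            # message passing over the similarity graph
score = float((H @ W2).mean())     # pool node outputs into one matching score
print(score)
```

In a trained model the weights would be learned so that the pooled score ranks matching query/gallery pairs above non-matching ones.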
Persistent Identifier: http://hdl.handle.net/10722/284261
ISSN: 1550-5499
ISI Accession Number ID: WOS:000531438103022

 

Dublin Core record (field: value):

dc.contributor.author: Kuang, Z
dc.contributor.author: Gao, Y
dc.contributor.author: Li, G
dc.contributor.author: Luo, P
dc.contributor.author: Chen, Y
dc.contributor.author: Lin, L
dc.contributor.author: Zhang, W
dc.date.accessioned: 2020-07-20T05:57:20Z
dc.date.available: 2020-07-20T05:57:20Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings of IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October - 2 November 2019, p. 3066-3075
dc.identifier.issn: 1550-5499
dc.identifier.uri: http://hdl.handle.net/10722/284261
dc.description.abstract: Matching clothing images from customers and online shopping stores has rich applications in e-commerce. Existing algorithms encode an image as a global feature vector and perform retrieval with this global representation. However, discriminative local information on clothes is submerged in the global representation, resulting in sub-optimal performance. To address this issue, we propose a novel Graph Reasoning Network (GRNet) on a Similarity Pyramid, which learns similarities between a query and a gallery garment using both global and local representations at multiple scales. The similarity pyramid is represented by a similarity graph, where nodes represent similarities between clothing components at different scales, and the final matching score is obtained by message passing along edges. In GRNet, graph reasoning is solved by training a graph convolutional network, which aligns salient clothing components to improve clothing retrieval. To facilitate future research, we introduce a new benchmark, FindFashion, containing rich annotations of bounding boxes, views, occlusions, and cropping. Extensive experiments show that GRNet obtains new state-of-the-art results on two challenging benchmarks, e.g. pushing the top-1, top-20, and top-50 accuracies on DeepFashion to 26%, 64%, and 75% (i.e. 4%, 10%, and 10% absolute improvements), outperforming competitors by large margins. On FindFashion, GRNet achieves considerable improvements in all empirical settings.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers. The publisher's site for these proceedings is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000149
dc.relation.ispartof: IEEE International Conference on Computer Vision (ICCV) Proceedings
dc.rights: IEEE International Conference on Computer Vision (ICCV) Proceedings. Copyright © Institute of Electrical and Electronics Engineers.
dc.rights: ©2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.title: Fashion retrieval via graph reasoning networks on a similarity pyramid
dc.type: Conference_Paper
dc.identifier.email: Luo, P: pluo@hku.hk
dc.identifier.authority: Luo, P=rp02575
dc.identifier.doi: 10.1109/ICCV.2019.00316
dc.identifier.scopus: eid_2-s2.0-85081911144
dc.identifier.hkuros: 311015
dc.identifier.spage: 3066
dc.identifier.epage: 3075
dc.identifier.isi: WOS:000531438103022
dc.publisher.place: United States
dc.identifier.issnl: 1550-5499
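Dublin Core records like the one above are typically harvested over the OAI-PMH protocol with a `GetRecord` request. The sketch below builds such a request URL and parses a minimal `oai_dc` fragment; the base URL and the OAI identifier form are hypothetical, since the record does not state the repository's actual endpoint.

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

# Hypothetical OAI-PMH endpoint; the real base URL is not given in the record.
BASE = "https://example-repository.org/oai/request"

def get_record_url(identifier: str) -> str:
    """Build an OAI-PMH GetRecord request URL for a Dublin Core record."""
    return BASE + "?" + urlencode({
        "verb": "GetRecord",
        "identifier": identifier,     # OAI identifier form assumed
        "metadataPrefix": "oai_dc",
    })

url = get_record_url("oai:example:10722/284261")

# Minimal oai_dc fragment of the kind a GetRecord response embeds,
# filled with values taken from the record above.
DC_NS = "http://purl.org/dc/elements/1.1/"
sample = f"""
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
           xmlns:dc="{DC_NS}">
  <dc:title>Fashion retrieval via graph reasoning networks on a similarity pyramid</dc:title>
  <dc:identifier>10.1109/ICCV.2019.00316</dc:identifier>
</oai_dc:dc>
"""

root = ET.fromstring(sample)
title = root.find(f"{{{DC_NS}}}title").text   # namespaced element lookup
print(title)
```

A real harvester would fetch `url` over HTTP and locate the `oai_dc:dc` element inside the `GetRecord` response envelope before parsing it as shown.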
