Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1109/ICCV.2019.00316
- Scopus: eid_2-s2.0-85081911144
- Web of Science: WOS:000531438103022
Conference Paper: Fashion retrieval via graph reasoning networks on a similarity pyramid
Field | Value
---|---
Title | Fashion retrieval via graph reasoning networks on a similarity pyramid
Authors | Kuang, Z; Gao, Y; Li, G; Luo, P; Chen, Y; Lin, L; Zhang, W
Issue Date | 2019
Publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000149
Citation | Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October - 2 November 2019, p. 3066-3075
Abstract | Matching clothing images from customers and online shopping stores has rich applications in E-commerce. Existing algorithms encode an image as a global feature vector and perform retrieval with this global representation. However, discriminative local information on clothes is submerged in the global representation, resulting in sub-optimal performance. To address this issue, we propose a novel Graph Reasoning Network (GRNet) on a Similarity Pyramid, which learns similarities between a query and a gallery clothing item by using both global and local representations at multiple scales. The similarity pyramid is represented by a similarity graph, where nodes represent similarities between clothing components at different scales, and the final matching score is obtained by message passing along edges. In GRNet, graph reasoning is performed by training a graph convolutional network, enabling salient clothing components to be aligned and retrieval to be improved. To facilitate future research, we introduce a new benchmark, FindFashion, containing rich annotations of bounding boxes, views, occlusions, and cropping. Extensive experiments show that GRNet obtains new state-of-the-art results on two challenging benchmarks, e.g., pushing the top-1, top-20, and top-50 accuracies on DeepFashion to 26%, 64%, and 75% (i.e., 4%, 10%, and 10% absolute improvements), outperforming competitors by large margins. On FindFashion, GRNet achieves considerable improvements in all empirical settings.
Persistent Identifier | http://hdl.handle.net/10722/284261
ISSN | 1550-5499 (2023 SCImago Journal Rankings: 12.263)
ISI Accession Number ID | WOS:000531438103022
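
The abstract describes GRNet as a graph convolutional network that reasons over a pyramid of similarities: nodes hold similarities between query and gallery clothing components at different scales, and message passing along edges produces the final matching score. The sketch below illustrates that general idea only; the module name, node features, adjacency scheme, and layer sizes are illustrative assumptions (PyTorch assumed) and are not the authors' implementation.

```python
# Minimal sketch of graph reasoning over a "similarity pyramid", loosely
# following the abstract above. All shapes, layer sizes, and the adjacency
# scheme are illustrative assumptions, not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimilarityGraphReasoning(nn.Module):
    def __init__(self, node_dim=128, num_layers=2):
        super().__init__()
        # One linear transform per GCN layer, shared across all nodes.
        self.layers = nn.ModuleList(nn.Linear(node_dim, node_dim) for _ in range(num_layers))
        # Maps pooled node features to a single query-gallery matching score.
        self.score = nn.Linear(node_dim, 1)

    def forward(self, node_feats, adj):
        # node_feats: (B, N, D) similarity features, one node per pair of
        #             clothing components at some scale of the pyramid.
        # adj:        (N, N) adjacency over similarity nodes (assumed given).
        # Symmetrically normalize the adjacency: D^{-1/2} (A + I) D^{-1/2}.
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a_hat.sum(-1).clamp(min=1e-6).pow(-0.5)
        a_norm = deg_inv_sqrt[:, None] * a_hat * deg_inv_sqrt[None, :]
        x = node_feats
        for layer in self.layers:
            # Message passing: aggregate neighbour features, then transform.
            x = F.relu(layer(a_norm @ x))
        # Pool all similarity nodes and predict the matching score.
        return self.score(x.mean(dim=1)).squeeze(-1)

# Toy usage: 9 similarity nodes (e.g. 3 scales x 3 component pairs), batch of 2.
if __name__ == "__main__":
    model = SimilarityGraphReasoning(node_dim=128)
    feats = torch.randn(2, 9, 128)
    adj = torch.ones(9, 9)          # fully connected graph, for illustration
    print(model(feats, adj).shape)  # -> torch.Size([2])
```
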
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kuang, Z | - |
dc.contributor.author | Gao, Y | - |
dc.contributor.author | Li, G | - |
dc.contributor.author | Luo, P | - |
dc.contributor.author | Chen, Y | - |
dc.contributor.author | Lin, L | - |
dc.contributor.author | Zhang, W | - |
dc.date.accessioned | 2020-07-20T05:57:20Z | - |
dc.date.available | 2020-07-20T05:57:20Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Proceedings of IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October - 2 November 2019, p. 3066-3075 | - |
dc.identifier.issn | 1550-5499 | - |
dc.identifier.uri | http://hdl.handle.net/10722/284261 | - |
dc.description.abstract | Matching clothing images from customers and online shopping stores has rich applications in E-commerce. Existing algorithms encode an image as a global feature vector and perform retrieval with this global representation. However, discriminative local information on clothes is submerged in the global representation, resulting in sub-optimal performance. To address this issue, we propose a novel Graph Reasoning Network (GRNet) on a Similarity Pyramid, which learns similarities between a query and a gallery clothing item by using both global and local representations at multiple scales. The similarity pyramid is represented by a similarity graph, where nodes represent similarities between clothing components at different scales, and the final matching score is obtained by message passing along edges. In GRNet, graph reasoning is performed by training a graph convolutional network, enabling salient clothing components to be aligned and retrieval to be improved. To facilitate future research, we introduce a new benchmark, FindFashion, containing rich annotations of bounding boxes, views, occlusions, and cropping. Extensive experiments show that GRNet obtains new state-of-the-art results on two challenging benchmarks, e.g., pushing the top-1, top-20, and top-50 accuracies on DeepFashion to 26%, 64%, and 75% (i.e., 4%, 10%, and 10% absolute improvements), outperforming competitors by large margins. On FindFashion, GRNet achieves considerable improvements in all empirical settings. | -
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000149 | - |
dc.relation.ispartof | IEEE International Conference on Computer Vision (ICCV) Proceedings | - |
dc.rights | IEEE International Conference on Computer Vision (ICCV) Proceedings. Copyright © Institute of Electrical and Electronics Engineers. | - |
dc.rights | ©2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.title | Fashion retrieval via graph reasoning networks on a similarity pyramid | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Luo, P: pluo@hku.hk | - |
dc.identifier.authority | Luo, P=rp02575 | - |
dc.identifier.doi | 10.1109/ICCV.2019.00316 | - |
dc.identifier.scopus | eid_2-s2.0-85081911144 | - |
dc.identifier.hkuros | 311015 | - |
dc.identifier.spage | 3066 | - |
dc.identifier.epage | 3075 | - |
dc.identifier.isi | WOS:000531438103022 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1550-5499 | - |