Conference Paper: Using large-scale web data to facilitate textual query based retrieval of consumer photos

Title: Using large-scale web data to facilitate textual query based retrieval of consumer photos
Authors: Liu, Yiming; Xu, Dong; Tsang, Ivor W.; Luo, Jiebo
Keywords: Cross domain learning; Large-scale web data; Textual query based consumer photo retrieval
Issue Date: 2009
Citation: MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 55-64
Abstract: The rapid popularization of digital cameras and mobile phone cameras has led to an explosive growth of consumer photo collections. In this paper, we present a (quasi) real-time textual query based personal photo retrieval system by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "pool"), our system exploits the inverted file method to automatically find the positive web images that are related to the textual query "pool" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved relevant and irrelevant web images, we employ two simple but effective classification methods, k Nearest Neighbor (kNN) and decision stumps, to rank personal consumer photos. To further improve the photo retrieval performance, we propose three new relevance feedback methods via cross-domain learning. These methods effectively utilize both the web images and the consumer images. In particular, our proposed cross-domain learning methods can learn robust classifiers with only a very limited amount of labeled consumer photos from the user by leveraging the pre-learned decision stumps at interactive response time. Extensive experiments on both consumer and professional stock photo datasets demonstrated the effectiveness and efficiency of our system, which is also inherently not limited by any predefined lexicon. Copyright 2009 ACM.
Persistent Identifier: http://hdl.handle.net/10722/321394
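
The abstract above outlines a concrete pipeline: an inverted file maps the textual query to positive (and, by exclusion, negative) web images, and simple classifiers such as decision stumps trained on those web images then rank the consumer photos. The Python sketch below illustrates that flow under stated assumptions only; it is not the authors' implementation, and the inverted index, the toy feature vectors, and the stump-training heuristic (thresholds halfway between class means) are hypothetical.

# Hypothetical sketch of the retrieval pipeline described in the abstract.
# NOT the authors' code: the inverted index, toy features, and the
# stump-training heuristic are illustrative assumptions only.
import numpy as np

# Step 1: inverted file from textual descriptions to web image ids.
inverted_index = {
    "pool":  [0, 1],   # web images whose captions/categories contain "pool"
    "beach": [2, 3],
}

# Toy visual features: rows are images, columns are feature dimensions.
web_features = np.array([[0.9, 0.1],
                         [0.8, 0.2],
                         [0.1, 0.9],
                         [0.2, 0.8]])
consumer_photos = np.array([[0.85, 0.15],   # should rank high for "pool"
                            [0.15, 0.85]])  # should rank low for "pool"

def retrieve_web_images(query):
    """Positive web images contain the query term; the rest are treated as negative."""
    positive = set(inverted_index.get(query, []))
    negative = set(range(len(web_features))) - positive
    return sorted(positive), sorted(negative)

def train_decision_stumps(pos_ids, neg_ids):
    """One stump per feature dimension: threshold halfway between class means,
    sign chosen so positives score higher (an assumed, simplified heuristic)."""
    pos, neg = web_features[pos_ids], web_features[neg_ids]
    thresholds = (pos.mean(axis=0) + neg.mean(axis=0)) / 2.0
    signs = np.sign(pos.mean(axis=0) - neg.mean(axis=0))
    return thresholds, signs

def rank_consumer_photos(query):
    pos_ids, neg_ids = retrieve_web_images(query)
    thresholds, signs = train_decision_stumps(pos_ids, neg_ids)
    # Score each consumer photo by the signed number of stumps it passes.
    scores = (signs * np.sign(consumer_photos - thresholds)).sum(axis=1)
    order = np.argsort(-scores)                 # descending score = better match
    return list(zip(order.tolist(), scores[order].tolist()))

if __name__ == "__main__":
    print(rank_consumer_photos("pool"))   # consumer photo 0 should come first

In the paper's actual system, the decision stumps are pre-learned from the retrieved web images, which is what allows the proposed cross-domain relevance feedback methods to reuse them at interactive response time with only a small number of labeled consumer photos.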

 

DC Field | Value | Language
dc.contributor.author | Liu, Yiming | -
dc.contributor.author | Xu, Dong | -
dc.contributor.author | Tsang, Ivor W. | -
dc.contributor.author | Luo, Jiebo | -
dc.date.accessioned | 2022-11-03T02:18:37Z | -
dc.date.available | 2022-11-03T02:18:37Z | -
dc.date.issued | 2009 | -
dc.identifier.citation | MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 55-64 | -
dc.identifier.uri | http://hdl.handle.net/10722/321394 | -
dc.description.abstract | The rapid popularization of digital cameras and mobile phone cameras has led to an explosive growth of consumer photo collections. In this paper, we present a (quasi) real-time textual query based personal photo retrieval system by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "pool"), our system exploits the inverted file method to automatically find the positive web images that are related to the textual query "pool" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved relevant and irrelevant web images, we employ two simple but effective classification methods, k Nearest Neighbor (kNN) and decision stumps, to rank personal consumer photos. To further improve the photo retrieval performance, we propose three new relevance feedback methods via cross-domain learning. These methods effectively utilize both the web images and the consumer images. In particular, our proposed cross-domain learning methods can learn robust classifiers with only a very limited amount of labeled consumer photos from the user by leveraging the pre-learned decision stumps at interactive response time. Extensive experiments on both consumer and professional stock photo datasets demonstrated the effectiveness and efficiency of our system, which is also inherently not limited by any predefined lexicon. Copyright 2009 ACM. | -
dc.language | eng | -
dc.relation.ispartof | MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums | -
dc.subject | Cross domain learning | -
dc.subject | Large-scale web data | -
dc.subject | Textual query based consumer photo retrieval | -
dc.title | Using large-scale web data to facilitate textual query based retrieval of consumer photos | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1145/1631272.1631283 | -
dc.identifier.scopus | eid_2-s2.0-72549087420 | -
dc.identifier.spage | 55 | -
dc.identifier.epage | 64 | -
