Conference Paper: T-IRS: Textual query based image retrieval system for consumer photos

Title: T-IRS: Textual query based image retrieval system for consumer photos
Authors: Liu, Yiming; Xu, Dong; Tsang, Ivor W.; Luo, Jiebo
Keywords: Cross domain learning; Text based photo retrieval
Issue Date: 2009
Citation: MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 983-984
Abstract: In this demonstration, we present a (quasi) real-time textual query based image retrieval system (T-IRS) for consumer photos by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "boat"), our system automatically finds the positive web images that are related to the textual query "boat" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved positive and negative web images, we employ the decision stump ensemble classifier to rank personal consumer photos. To further improve the photo retrieval performance, we also develop a novel relevance feedback method, referred to as Cross-Domain Regularized Regression (CDRR), which effectively utilizes both the web images and the consumer images. Our system is inherently not limited by any predefined lexicon.
Persistent Identifier: http://hdl.handle.net/10722/321393
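The abstract above describes the ranking step only in prose. The sketch below is a hypothetical Python illustration, not the authors' implementation: it trains one decision stump per feature dimension on automatically gathered positive/negative web images and scores consumer photos by a weighted vote of the stumps, in the spirit of the decision stump ensemble the abstract mentions. The feature layout, the AdaBoost-style stump weighting, and all function and variable names are assumptions; web-image crawling, feature extraction, and the CDRR relevance-feedback step are not shown.

```python
# Hypothetical sketch of decision-stump-ensemble ranking (illustration only,
# not the T-IRS code). Web images carry automatic labels: +1 relevant to the
# textual query, -1 irrelevant. Consumer photos are then scored and ranked.
import numpy as np

def train_stump(feature_column, labels):
    """Pick the threshold and polarity on one feature dimension that
    minimises classification error on the labelled web images."""
    best = (0.0, 1, np.inf)  # (threshold, polarity, error)
    for threshold in np.unique(feature_column):
        for polarity in (1, -1):
            preds = np.where(polarity * (feature_column - threshold) >= 0, 1, -1)
            error = np.mean(preds != labels)
            if error < best[2]:
                best = (threshold, polarity, error)
    return best

def stump_ensemble_scores(web_features, web_labels, photo_features):
    """Train one stump per feature dimension on the web images and score
    consumer photos by a weighted vote of the stumps."""
    scores = np.zeros(photo_features.shape[0])
    for d in range(web_features.shape[1]):
        threshold, polarity, error = train_stump(web_features[:, d], web_labels)
        if error >= 0.5:  # skip stumps no better than chance
            continue
        weight = 0.5 * np.log((1 - error + 1e-12) / (error + 1e-12))
        preds = np.where(polarity * (photo_features[:, d] - threshold) >= 0, 1, -1)
        scores += weight * preds
    return scores  # higher score = ranked closer to the query

# Toy usage: random vectors stand in for visual descriptors of the
# retrieved web images and of the personal consumer photos.
rng = np.random.default_rng(0)
web_X = rng.normal(size=(200, 16))
web_y = np.where(web_X[:, 0] + 0.5 * web_X[:, 3] > 0, 1, -1)
photo_X = rng.normal(size=(50, 16))
ranking = np.argsort(-stump_ensemble_scores(web_X, web_y, photo_X))
print(ranking[:10])  # indices of the ten highest-ranked consumer photos
```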

 

DC Field: Value
dc.contributor.author: Liu, Yiming
dc.contributor.author: Xu, Dong
dc.contributor.author: Tsang, Ivor W.
dc.contributor.author: Luo, Jiebo
dc.date.accessioned: 2022-11-03T02:18:37Z
dc.date.available: 2022-11-03T02:18:37Z
dc.date.issued: 2009
dc.identifier.citation: MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums, 2009, p. 983-984
dc.identifier.uri: http://hdl.handle.net/10722/321393
dc.description.abstract: In this demonstration, we present a (quasi) real-time textual query based image retrieval system (T-IRS) for consumer photos by leveraging millions of web images and their associated rich textual descriptions (captions, categories, etc.). After a user provides a textual query (e.g., "boat"), our system automatically finds the positive web images that are related to the textual query "boat" as well as the negative web images which are irrelevant to the textual query. Based on these automatically retrieved positive and negative web images, we employ the decision stump ensemble classifier to rank personal consumer photos. To further improve the photo retrieval performance, we also develop a novel relevance feedback method, referred to as Cross-Domain Regularized Regression (CDRR), which effectively utilizes both the web images and the consumer images. Our system is inherently not limited by any predefined lexicon.
dc.language: eng
dc.relation.ispartof: MM'09 - Proceedings of the 2009 ACM Multimedia Conference, with Co-located Workshops and Symposiums
dc.subject: Cross domain learning
dc.subject: Text based photo retrieval
dc.title: T-IRS: Textual query based image retrieval system for consumer photos
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1145/1631272.1631479
dc.identifier.scopus: eid_2-s2.0-72449210182
dc.identifier.spage: 983
dc.identifier.epage: 984
