Article: Domain transfer multiple kernel learning

Title: Domain transfer multiple kernel learning
Authors: Duan, Lixin; Tsang, Ivor W.; Xu, Dong
Keywords: Cross-domain learning; domain adaptation; multiple kernel learning; support vector machine; transfer learning
Issue Date: 2012
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, v. 34, n. 3, p. 465-479
Abstract: Cross-domain learning methods have shown promising results by leveraging labeled patterns from the auxiliary domain to learn a robust classifier for the target domain, which has only a limited number of labeled samples. To cope with the considerable change between feature distributions of different domains, we propose a new cross-domain kernel learning framework into which many existing kernel methods can be readily incorporated. Our framework, referred to as Domain Transfer Multiple Kernel Learning (DTMKL), simultaneously learns a kernel function and a robust classifier by minimizing both the structural risk functional and the distribution mismatch between the labeled and unlabeled samples from the auxiliary and target domains. Under the DTMKL framework, we also propose two novel methods by using SVM and prelearned classifiers, respectively. Comprehensive experiments on three domain adaptation data sets (i.e., TRECVID, 20 Newsgroups, and email spam data sets) demonstrate that DTMKL-based methods outperform existing cross-domain learning and multiple kernel learning methods. © 2012 IEEE.
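The abstract describes minimizing a distribution mismatch between auxiliary and target domains over a combination of base kernels. The sketch below is not the paper's algorithm (which also minimizes the structural risk of a classifier jointly); it only illustrates, under the common assumption that the mismatch is measured by maximum mean discrepancy (MMD), how the mismatch of a weighted kernel combination decomposes linearly over the base kernels. All function names and the toy data are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian RBF kernel matrix between rows of X and rows of Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def mmd2(K, n_aux, n_tar):
    """Biased squared-MMD estimate between auxiliary and target samples,
    computed from a kernel matrix over the stacked samples (auxiliary
    rows first). Equals s^T K s for the signed indicator vector s."""
    s = np.concatenate([np.full(n_aux, 1.0 / n_aux),
                        np.full(n_tar, -1.0 / n_tar)])
    return float(s @ K @ s)

# Toy data: auxiliary and target domains with shifted means.
rng = np.random.default_rng(0)
X_aux = rng.normal(0.0, 1.0, size=(40, 5))
X_tar = rng.normal(1.0, 1.0, size=(30, 5))
X = np.vstack([X_aux, X_tar])

# Base kernels at several bandwidths (the multiple-kernel setup).
base_kernels = [rbf_kernel(X, X, g) for g in (0.01, 0.1, 1.0)]

# MMD^2 is linear in the kernel matrix, so for kernel weights d the
# combined mismatch is sum_m d_m * mmd2(K_m): the per-kernel terms
# below are all a weight-learning method needs to evaluate it.
mismatches = np.array([mmd2(K, 40, 30) for K in base_kernels])
print(mismatches)
```

This linearity is what makes the mismatch term tractable to optimize over the kernel weights; the full objective in the paper couples it with the classifier's structural risk.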
Persistent Identifier: http://hdl.handle.net/10722/321476
ISSN: 0162-8828
2023 Impact Factor: 20.8
2023 SCImago Journal Rankings: 6.158
ISI Accession Number ID: WOS:000299381600004

 

DC Field | Value
dc.contributor.author | Duan, Lixin
dc.contributor.author | Tsang, Ivor W.
dc.contributor.author | Xu, Dong
dc.date.accessioned | 2022-11-03T02:19:10Z
dc.date.available | 2022-11-03T02:19:10Z
dc.date.issued | 2012
dc.identifier.citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, v. 34, n. 3, p. 465-479
dc.identifier.issn | 0162-8828
dc.identifier.uri | http://hdl.handle.net/10722/321476
dc.description.abstract | Cross-domain learning methods have shown promising results by leveraging labeled patterns from the auxiliary domain to learn a robust classifier for the target domain which has only a limited number of labeled samples. To cope with the considerable change between feature distributions of different domains, we propose a new cross-domain kernel learning framework into which many existing kernel methods can be readily incorporated. Our framework, referred to as Domain Transfer Multiple Kernel Learning (DTMKL), simultaneously learns a kernel function and a robust classifier by minimizing both the structural risk functional and the distribution mismatch between the labeled and unlabeled samples from the auxiliary and target domains. Under the DTMKL framework, we also propose two novel methods by using SVM and prelearned classifiers, respectively. Comprehensive experiments on three domain adaptation data sets (i.e., TRECVID, 20 Newsgroups, and email spam data sets) demonstrate that DTMKL-based methods outperform existing cross-domain learning and multiple kernel learning methods. © 2012 IEEE.
dc.language | eng
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence
dc.subject | Cross-domain learning
dc.subject | domain adaptation
dc.subject | multiple kernel learning
dc.subject | support vector machine
dc.subject | transfer learning
dc.title | Domain transfer multiple kernel learning
dc.type | Article
dc.description.nature | link_to_subscribed_fulltext
dc.identifier.doi | 10.1109/TPAMI.2011.114
dc.identifier.scopus | eid_2-s2.0-84863393661
dc.identifier.volume | 34
dc.identifier.issue | 3
dc.identifier.spage | 465
dc.identifier.epage | 479
dc.identifier.isi | WOS:000299381600004
