File Download: No files are associated with this item; the full text is available via a subscription link.

Article: Domain adaptation from multiple sources: A domain-dependent regularization approach

Title: Domain adaptation from multiple sources: A domain-dependent regularization approach
Authors: Duan, Lixin; Xu, Dong; Tsang, Ivor Wai Hung
Keywords: Domain adaptation machine; domain-dependent regularizer; multiple source domain adaptation
Issue Date: 2012
Citation: IEEE Transactions on Neural Networks and Learning Systems, 2012, v. 23, n. 3, p. 504-518
Abstract: In this paper, we propose a new framework called domain adaptation machine (DAM) for the multiple source domain adaptation problem. Under this framework, we learn a robust decision function (referred to as the target classifier) for label prediction of instances from the target domain by leveraging a set of base classifiers which are prelearned using labeled instances either from the source domains or from the source domains and the target domain. With the base classifiers, we propose a new domain-dependent regularizer based on the smoothness assumption, which enforces that the target classifier shares similar decision values with the relevant base classifiers on the unlabeled instances from the target domain. This newly proposed regularizer can be readily incorporated into many kernel methods (e.g., support vector machines (SVM), support vector regression, and least-squares SVM (LS-SVM)). For domain adaptation, we also develop two new methods referred to as FastDAM and UniverDAM. In FastDAM, we introduce the proposed domain-dependent regularizer into LS-SVM and also employ a sparsity regularizer to learn a sparse target classifier with support vectors only from the target domain, which makes label prediction on any test instance very fast. In UniverDAM, we additionally make use of the instances from the source domains as the Universum to further enhance the generalization ability of the target classifier. We evaluate our two methods on the challenging TRECVID 2005 dataset for the large-scale video concept detection task as well as on the 20 newsgroups and email spam datasets for document retrieval. Comprehensive experiments demonstrate that FastDAM and UniverDAM outperform the existing multiple source domain adaptation methods for the two applications. © 2012 IEEE.
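As a reading aid only, the following is a minimal sketch of the smoothness-based, domain-dependent regularizer that the abstract describes, reconstructed from the abstract alone; the symbols below (target classifier f^T, base classifiers f^s, relevance weights gamma_s, unlabeled target set D_T^u) and the squared-difference form are assumptions, not the authors' published formulation:

    % Assumed form of a domain-dependent regularizer over S source domains:
    \Omega_D(f^T) \;=\; \frac{1}{2} \sum_{s=1}^{S} \gamma_s
        \sum_{x_i \in \mathcal{D}_T^u} \bigl( f^T(x_i) - f^s(x_i) \bigr)^2

Adding such a term to a standard kernel-method objective (e.g., the LS-SVM loss mentioned in the abstract) penalizes the target classifier whenever its decision values on unlabeled target instances deviate from those of a base classifier f^s that is deemed relevant (large gamma_s), which is the smoothness behaviour the abstract attributes to the regularizer.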
Persistent Identifier: http://hdl.handle.net/10722/321465
ISSN: 2162-237X
2023 Impact Factor: 10.2
2023 SCImago Journal Rankings: 4.170
ISI Accession Number ID: WOS:000302705100011

DC Field | Value | Language
dc.contributor.author | Duan, Lixin | -
dc.contributor.author | Xu, Dong | -
dc.contributor.author | Tsang, Ivor Wai Hung | -
dc.date.accessioned | 2022-11-03T02:19:06Z | -
dc.date.available | 2022-11-03T02:19:06Z | -
dc.date.issued | 2012 | -
dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2012, v. 23, n. 3, p. 504-518 | -
dc.identifier.issn | 2162-237X | -
dc.identifier.uri | http://hdl.handle.net/10722/321465 | -
dc.description.abstract | In this paper, we propose a new framework called domain adaptation machine (DAM) for the multiple source domain adaptation problem. Under this framework, we learn a robust decision function (referred to as the target classifier) for label prediction of instances from the target domain by leveraging a set of base classifiers which are prelearned using labeled instances either from the source domains or from the source domains and the target domain. With the base classifiers, we propose a new domain-dependent regularizer based on the smoothness assumption, which enforces that the target classifier shares similar decision values with the relevant base classifiers on the unlabeled instances from the target domain. This newly proposed regularizer can be readily incorporated into many kernel methods (e.g., support vector machines (SVM), support vector regression, and least-squares SVM (LS-SVM)). For domain adaptation, we also develop two new methods referred to as FastDAM and UniverDAM. In FastDAM, we introduce the proposed domain-dependent regularizer into LS-SVM and also employ a sparsity regularizer to learn a sparse target classifier with support vectors only from the target domain, which makes label prediction on any test instance very fast. In UniverDAM, we additionally make use of the instances from the source domains as the Universum to further enhance the generalization ability of the target classifier. We evaluate our two methods on the challenging TRECVID 2005 dataset for the large-scale video concept detection task as well as on the 20 newsgroups and email spam datasets for document retrieval. Comprehensive experiments demonstrate that FastDAM and UniverDAM outperform the existing multiple source domain adaptation methods for the two applications. © 2012 IEEE. | -
dc.language | eng | -
dc.relation.ispartof | IEEE Transactions on Neural Networks and Learning Systems | -
dc.subject | Domain adaptation machine | -
dc.subject | domain-dependent regularizer | -
dc.subject | multiple source domain adaptation | -
dc.title | Domain adaptation from multiple sources: A domain-dependent regularization approach | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/TNNLS.2011.2178556 | -
dc.identifier.scopus | eid_2-s2.0-84862192949 | -
dc.identifier.volume | 23 | -
dc.identifier.issue | 3 | -
dc.identifier.spage | 504 | -
dc.identifier.epage | 518 | -
dc.identifier.eissn | 2162-2388 | -
dc.identifier.isi | WOS:000302705100011 | -

This record can also be exported via the OAI-PMH interface in XML formats, or in other non-XML formats.
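As a rough illustration of the OAI-PMH route, the Python sketch below requests the record as unqualified Dublin Core. The endpoint URL and OAI identifier are assumptions for illustration (only the handle 10722/321465 comes from this page), so adjust them to the repository's actual OAI-PMH base URL and identifier scheme.

    import urllib.parse
    import urllib.request

    # Hypothetical OAI-PMH endpoint and identifier; not taken from this page.
    BASE_URL = "https://hub.hku.hk/oai/request"
    IDENTIFIER = "oai:hub.hku.hk:10722/321465"

    params = {
        "verb": "GetRecord",          # standard OAI-PMH verb for a single record
        "metadataPrefix": "oai_dc",   # unqualified Dublin Core, as in the table above
        "identifier": IDENTIFIER,
    }
    url = BASE_URL + "?" + urllib.parse.urlencode(params)

    # Fetch and print the start of the returned XML record.
    with urllib.request.urlopen(url) as response:
        xml = response.read().decode("utf-8")
    print(xml[:1000])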