Article: Self-Paced Collaborative and Adversarial Network for Unsupervised Domain Adaptation

Title: Self-Paced Collaborative and Adversarial Network for Unsupervised Domain Adaptation
Authors: Zhang, Weichen; Xu, Dong; Ouyang, Wanli; Li, Wen
Keywords: adversarial learning; deep learning; domain adaptation; self-paced learning; transfer learning
Issue Date: 2021
Citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, v. 43, n. 6, p. 2047-2061
Abstract: This paper proposes a new unsupervised domain adaptation approach called Collaborative and Adversarial Network (CAN), which uses domain-collaborative and domain-adversarial learning strategies for training the neural network. The domain-collaborative learning strategy aims to learn domain-specific feature representations that preserve discriminability for the target domain, while the domain-adversarial learning strategy aims to learn domain-invariant feature representations that reduce the distribution mismatch between the source and target domains. We show that these two learning strategies can be uniformly formulated as domain classifier learning with positive or negative weights on the losses. We then design a collaborative and adversarial training scheme, which automatically learns domain-specific representations from the lower blocks of a CNN through collaborative learning and domain-invariant representations from the higher blocks through adversarial learning. Moreover, to further enhance discriminability in the target domain, we propose Self-Paced CAN (SPCAN), which progressively selects pseudo-labeled target samples for re-training the classifiers. We employ a self-paced learning strategy so that pseudo-labeled target samples are selected in an easy-to-hard fashion. Additionally, we build upon the popular two-stream approach to extend our domain adaptation method to the more challenging video action recognition task, which additionally considers cooperation between the RGB stream and the optical flow stream. We propose the Two-stream SPCAN (TS-SPCAN) method to select and reweight the pseudo-labeled target samples of one stream (RGB/Flow) based on information from the other stream (Flow/RGB) in a cooperative way. As a result, our TS-SPCAN model is able to exchange information between the two streams.
Comprehensive experiments on the Office-31, ImageCLEF-DA and VISDA-2017 benchmark datasets for object recognition, and on UCF101-10 and HMDB51-10 for video action recognition, show that our newly proposed approaches achieve state-of-the-art performance, clearly demonstrating their effectiveness for unsupervised domain adaptation.
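The abstract's two core mechanisms can be illustrated with a minimal sketch: a per-block domain-classifier loss combined with signed weights (positive weights keep features domain-discriminative, i.e. collaborative; negative weights reverse the gradient direction, i.e. adversarial), and an easy-to-hard pseudo-label selection schedule. This is a simplified NumPy illustration under assumed names and a hypothetical linear threshold schedule, not the paper's exact formulation:

```python
import numpy as np

def weighted_domain_loss(block_probs, domain_labels, block_weights):
    """Combine per-block domain-classifier (binary cross-entropy) losses
    with signed weights. A positive weight keeps the usual loss direction
    (collaborative: lower blocks stay domain-specific); a negative weight
    flips it (adversarial: higher blocks become domain-invariant)."""
    eps = 1e-12  # numerical guard for log(0)
    total = 0.0
    for probs, w in zip(block_probs, block_weights):
        bce = -np.mean(domain_labels * np.log(probs + eps)
                       + (1 - domain_labels) * np.log(1 - probs + eps))
        total += w * bce
    return total

def self_paced_select(confidences, round_idx, start=0.9, step=0.1):
    """Select pseudo-labeled target samples easy-to-hard: start with a
    high confidence threshold and relax it as training rounds progress
    (the linear schedule here is an illustrative assumption)."""
    threshold = max(start - step * round_idx, 0.0)
    return np.where(confidences >= threshold)[0]

# Two blocks with identical domain predictions: a +1 (collaborative) and a
# -1 (adversarial) weight cancel, showing the opposing training signals.
probs = np.array([0.9, 0.1])          # predicted "source" probability
labels = np.array([1, 0])             # 1 = source sample, 0 = target sample
print(weighted_domain_loss([probs, probs], labels, [1.0, -1.0]))

# Easy-to-hard selection: the sample set grows as the threshold relaxes.
conf = np.array([0.95, 0.50, 0.85])
print(self_paced_select(conf, round_idx=0))  # only the most confident sample
print(self_paced_select(conf, round_idx=1))  # more samples admitted
```

The signed-weight view is what lets both strategies share one domain-classifier objective; in practice the negative weight is typically realized with a gradient reversal layer rather than an explicit sign flip on the loss.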
Persistent Identifier: http://hdl.handle.net/10722/322052
ISSN: 0162-8828
2021 Impact Factor: 24.314
2020 SCImago Journal Rankings: 3.811
ISI Accession Number ID: WOS:000649590200016

 

DC Field: Value
dc.contributor.author: Zhang, Weichen
dc.contributor.author: Xu, Dong
dc.contributor.author: Ouyang, Wanli
dc.contributor.author: Li, Wen
dc.date.accessioned: 2022-11-03T02:23:16Z
dc.date.available: 2022-11-03T02:23:16Z
dc.date.issued: 2021
dc.identifier.citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, v. 43, n. 6, p. 2047-2061
dc.identifier.issn: 0162-8828
dc.identifier.uri: http://hdl.handle.net/10722/322052
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Pattern Analysis and Machine Intelligence
dc.subject: adversarial learning
dc.subject: deep learning
dc.subject: Domain adaptation
dc.subject: self-paced learning
dc.subject: transfer learning
dc.title: Self-Paced Collaborative and Adversarial Network for Unsupervised Domain Adaptation
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TPAMI.2019.2962476
dc.identifier.pmid: 31880543
dc.identifier.scopus: eid_2-s2.0-85077269119
dc.identifier.volume: 43
dc.identifier.issue: 6
dc.identifier.spage: 2047
dc.identifier.epage: 2061
dc.identifier.eissn: 1939-3539
dc.identifier.isi: WOS:000649590200016
