Links for fulltext (may require subscription):
- Publisher (DOI): 10.1016/j.knosys.2020.106132
- Scopus: eid_2-s2.0-85086630326
- Web of Science: WOS:000552126200022
Article: Multi-component transfer metric learning for handling unrelated source domain samples
Title | Multi-component transfer metric learning for handling unrelated source domain samples |
---|---|
Authors | Yi, C; Xu, Y; Yu, H; Yan, Y; Liu, Y |
Keywords | Transfer learning; Metric learning; Component; Mahalanobis distance; Weight matrix |
Issue Date | 2020 |
Publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/knosys |
Citation | Knowledge-Based Systems, 2020, v. 203, article no. 106132 |
Abstract | Transfer learning (TL) is a machine learning paradigm designed for the problem where the training and test data are from different domains. Existing TL approaches mostly assume that training data from the source domain are collected from multiple views or devices. However, in practical applications, a sample in a target domain often corresponds to only a specific view or device. Without the ability to mitigate the influence of the many unrelated samples, the performance of existing TL approaches may deteriorate for such learning tasks. This problem is exacerbated if the intrinsic relationships among the source domain samples are unclear. Currently, there is no mechanism for determining the intrinsic characteristics of samples in order to treat them differently during TL. Source domain samples that are not related to the test data not only incur computational overhead, but may also result in negative transfer. We propose the multi-component transfer metric learning (MCTML) method to address this challenging research problem. Unlike previous metric-based transfer learning approaches, which are only capable of using one metric to transform all the samples, MCTML automatically extracts distinct components from the source domain and learns one metric for each component. For each component, MCTML learns the importance of that component in terms of its predictive power based on the Mahalanobis distance metrics. The optimized combination of components is then used to predict the test data collaboratively. Extensive experiments on public datasets demonstrate its effectiveness in knowledge transfer under this challenging condition. |
Persistent Identifier | http://hdl.handle.net/10722/294308 |
ISSN | 0950-7051 (2023 Impact Factor: 7.2; 2023 SCImago Journal Rankings: 2.219) |
ISI Accession Number ID | WOS:000552126200022 |
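The abstract describes MCTML's core idea: learn one Mahalanobis metric per source-domain component, weight each component by its predictive power, and combine the components to predict test data. The paper's actual optimization is not given in this record, so the following is only a minimal illustrative sketch of that combination step; the function names, the 1-nearest-neighbour rule, and the weighted-vote aggregation are all assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance between x and y under a PSD metric matrix M."""
    d = x - y
    return float(d @ M @ d)

def predict(x, components, weights):
    """Predict a label for x by a weighted vote across components.

    components: list of (X, y, M) tuples -- samples, labels, and the
                metric learned for that component (here assumed given).
    weights:    importance of each component (e.g. its predictive power).
    """
    scores = {}
    for (X, y, M), w in zip(components, weights):
        # Each component votes with its nearest neighbour under its own metric.
        dists = [mahalanobis_sq(x, xi, M) for xi in X]
        label = y[int(np.argmin(dists))]
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)
```

With identity metrics this reduces to a weighted Euclidean nearest-neighbour vote; the point of the method is that each component's learned M reshapes distances so that only samples related to the test data dominate the vote.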
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yi, C | - |
dc.contributor.author | Xu, Y | - |
dc.contributor.author | Yu, H | - |
dc.contributor.author | Yan, Y | - |
dc.contributor.author | Liu, Y | - |
dc.date.accessioned | 2020-11-23T08:29:31Z | - |
dc.date.available | 2020-11-23T08:29:31Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | Knowledge-Based Systems, 2020, v. 203, p. article no. 106132 | - |
dc.identifier.issn | 0950-7051 | - |
dc.identifier.uri | http://hdl.handle.net/10722/294308 | - |
dc.description.abstract | Transfer learning (TL) is a machine learning paradigm designed for the problem where the training and test data are from different domains. Existing TL approaches mostly assume that training data from the source domain are collected from multiple views or devices. However, in practical applications, a sample in a target domain often corresponds to only a specific view or device. Without the ability to mitigate the influence of the many unrelated samples, the performance of existing TL approaches may deteriorate for such learning tasks. This problem is exacerbated if the intrinsic relationships among the source domain samples are unclear. Currently, there is no mechanism for determining the intrinsic characteristics of samples in order to treat them differently during TL. Source domain samples that are not related to the test data not only incur computational overhead, but may also result in negative transfer. We propose the multi-component transfer metric learning (MCTML) method to address this challenging research problem. Unlike previous metric-based transfer learning approaches, which are only capable of using one metric to transform all the samples, MCTML automatically extracts distinct components from the source domain and learns one metric for each component. For each component, MCTML learns the importance of that component in terms of its predictive power based on the Mahalanobis distance metrics. The optimized combination of components is then used to predict the test data collaboratively. Extensive experiments on public datasets demonstrate its effectiveness in knowledge transfer under this challenging condition. | - |
dc.language | eng | - |
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/knosys | - |
dc.relation.ispartof | Knowledge-Based Systems | - |
dc.subject | Transfer learning | - |
dc.subject | Metric learning | - |
dc.subject | Component | - |
dc.subject | Mahalanobis distance | - |
dc.subject | Weight matrix | - |
dc.title | Multi-component transfer metric learning for handling unrelated source domain samples | - |
dc.type | Article | - |
dc.identifier.email | Yan, Y: ygyan@hku.hk | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.knosys.2020.106132 | - |
dc.identifier.scopus | eid_2-s2.0-85086630326 | - |
dc.identifier.hkuros | 319011 | - |
dc.identifier.volume | 203 | - |
dc.identifier.spage | article no. 106132 | - |
dc.identifier.epage | article no. 106132 | - |
dc.identifier.isi | WOS:000552126200022 | - |
dc.publisher.place | Netherlands | - |
dc.identifier.issnl | 0950-7051 | - |