Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1016/j.patcog.2021.108337
- Scopus: eid_2-s2.0-85116020582
- Web of Science: WOS:000704893500002
Article: Kernelized Support Tensor Train Machines
Title | Kernelized Support Tensor Train Machines |
---|---|
Authors | Chen, C; Batselier, K; Yu, W; Wong, N |
Keywords | Image classification; Tensor; Support tensor machine |
Issue Date | 2022 |
Publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/pr |
Citation | Pattern Recognition, 2022, v. 122, article no. 108337 |
Abstract | Tensors, multi-dimensional data structures, have recently been exploited in the machine learning community. Traditional machine learning approaches are vector- or matrix-based and cannot handle tensorial data directly. In this paper, we propose a tensor train (TT)-based kernel technique for the first time and apply it to the conventional support vector machine (SVM) for high-dimensional image classification with a very small number of training samples. Specifically, we propose a kernelized support tensor train machine that accepts tensorial input and preserves the intrinsic kernel property. The main contributions are threefold. First, we propose a TT-based feature mapping procedure that maintains the TT structure in the feature space. Second, we demonstrate two ways to construct the TT-based kernel function while considering consistency with the TT inner product and preservation of information. Third, we show that it is possible to apply different kernel functions on different data modes. In principle, our method tensorizes the standard SVM in both its input structure and its kernel mapping scheme, reducing the storage and computational complexity of kernel matrix construction from exponential to polynomial. Validity proofs and the computational complexity of the proposed TT-based kernel functions are provided in detail. Extensive experiments on high-dimensional fMRI and color image datasets demonstrate the superiority of the proposed scheme over state-of-the-art techniques. |
Persistent Identifier | http://hdl.handle.net/10722/307868 |
ISSN | 0031-3203 (2021 Impact Factor: 8.518; 2020 SCImago Journal Rankings: 1.492) |
ISI Accession Number ID | WOS:000704893500002 |
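The abstract's complexity claim rests on a standard property of the TT format: inner products, and therefore many kernels, can be evaluated directly on the TT cores by sequential contraction, at polynomial rather than exponential cost in the tensor order. The sketch below is only an illustration of that property, not the paper's own kernel construction; the function names and the choice of a Gaussian kernel on the TT-induced distance are assumptions made here for demonstration.

```python
import numpy as np

def tt_inner(cores_a, cores_b):
    """Inner product of two tensors given by their TT cores.

    Each core has shape (r_prev, n_k, r_next), with boundary ranks 1.
    Cost is polynomial in the order d, mode size n and TT rank r,
    versus O(n**d) if the full tensors were reconstructed first.
    """
    v = np.ones((1, 1))  # running (rank_a x rank_b) contraction
    for A, B in zip(cores_a, cores_b):
        # Absorb one mode at a time: v[a, b] -> v[c, d]
        v = np.einsum('ab,aic,bid->cd', v, A, B)
    return float(v[0, 0])

def rbf_tt_kernel(cores_a, cores_b, gamma=1.0):
    """Illustrative Gaussian kernel on tensors kept in TT format,
    using ||X - Y||^2 = <X,X> - 2<X,Y> + <Y,Y>."""
    d2 = (tt_inner(cores_a, cores_a)
          - 2.0 * tt_inner(cores_a, cores_b)
          + tt_inner(cores_b, cores_b))
    return np.exp(-gamma * d2)
```

A kernel matrix built this way can be passed to any kernel machine that accepts precomputed Gram matrices; the point of the sketch is only that every entry is computed without ever forming the exponentially large full tensors.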
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, C | - |
dc.contributor.author | Batselier, K | - |
dc.contributor.author | Yu, W | - |
dc.contributor.author | Wong, N | - |
dc.date.accessioned | 2021-11-12T13:39:04Z | - |
dc.date.available | 2021-11-12T13:39:04Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | Pattern Recognition, 2022, v. 122, article no. 108337 | - |
dc.identifier.issn | 0031-3203 | - |
dc.identifier.uri | http://hdl.handle.net/10722/307868 | - |
dc.description.abstract | Tensors, multi-dimensional data structures, have recently been exploited in the machine learning community. Traditional machine learning approaches are vector- or matrix-based and cannot handle tensorial data directly. In this paper, we propose a tensor train (TT)-based kernel technique for the first time and apply it to the conventional support vector machine (SVM) for high-dimensional image classification with a very small number of training samples. Specifically, we propose a kernelized support tensor train machine that accepts tensorial input and preserves the intrinsic kernel property. The main contributions are threefold. First, we propose a TT-based feature mapping procedure that maintains the TT structure in the feature space. Second, we demonstrate two ways to construct the TT-based kernel function while considering consistency with the TT inner product and preservation of information. Third, we show that it is possible to apply different kernel functions on different data modes. In principle, our method tensorizes the standard SVM in both its input structure and its kernel mapping scheme, reducing the storage and computational complexity of kernel matrix construction from exponential to polynomial. Validity proofs and the computational complexity of the proposed TT-based kernel functions are provided in detail. Extensive experiments on high-dimensional fMRI and color image datasets demonstrate the superiority of the proposed scheme over state-of-the-art techniques. | - |
dc.language | eng | - |
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/pr | - |
dc.relation.ispartof | Pattern Recognition | - |
dc.subject | Image classification | - |
dc.subject | Tensor | - |
dc.subject | Support tensor machine | - |
dc.title | Kernelized Support Tensor Train Machines | - |
dc.type | Article | - |
dc.identifier.email | Wong, N: nwong@eee.hku.hk | - |
dc.identifier.authority | Wong, N=rp00190 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.patcog.2021.108337 | - |
dc.identifier.scopus | eid_2-s2.0-85116020582 | - |
dc.identifier.hkuros | 329305 | - |
dc.identifier.volume | 122 | - |
dc.identifier.spage | article no. 108337 | - |
dc.identifier.epage | article no. 108337 | - |
dc.identifier.isi | WOS:000704893500002 | - |
dc.publisher.place | Netherlands | - |