Conference Paper: A support tensor train machine

Title: A support tensor train machine
Authors: Chen, C; Batselier, K; Ko, CY; Wong, N
Issue Date: 2019
Publisher: IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500
Citation: Proceedings of 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14-19 July 2019
Abstract: There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and the support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of the STM is restricted by its rank-one tensor constraint, and the STuM is not scalable because of its exponentially sized Tucker core tensor. To overcome these limitations, we introduce a novel and effective support tensor train machine (STTM) that employs a general and scalable tensor train as the parameter model. Experiments confirm the superiority of the STTM over the SVM, STM and STuM.
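The scalability argument in the abstract rests on how a tensor train factorizes a d-way tensor into d small three-way cores, so the parameter count grows linearly in the number of modes rather than exponentially. A minimal NumPy sketch of the idea — the shapes and TT-ranks below are chosen for illustration and are not taken from the paper:

```python
import numpy as np

# Illustrative tensor-train (TT) representation of a 3-way weight
# tensor W of shape (4, 4, 4), with assumed TT-ranks (1, 2, 2, 1).
shape = (4, 4, 4)
ranks = (1, 2, 2, 1)

rng = np.random.default_rng(0)
# One TT-core per mode: core k has shape (r_k, n_k, r_{k+1}).
cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(3)]

# Contract the cores back into the full tensor:
# W[i, j, k] = G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :]  (a 1x1 matrix).
W = np.einsum('aib,bjc,ckd->ijk', cores[0], cores[1], cores[2])

# Parameter counts: the TT format stores only the cores,
# while the dense tensor stores every entry.
tt_params = sum(c.size for c in cores)  # 1*4*2 + 2*4*2 + 2*4*1 = 32
full_params = int(np.prod(shape))       # 4**3 = 64
print(tt_params, full_params)
```

Even at this toy size the TT format halves the storage, and the gap widens rapidly as the number of modes or the mode sizes grow — which is what makes a TT parameterization tractable where an exponentially sized dense Tucker core is not.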
Persistent Identifier: http://hdl.handle.net/10722/275280
ISBN: 978-1-7281-1986-1

dc.contributor.author: Chen, C
dc.contributor.author: Batselier, K
dc.contributor.author: Ko, CY
dc.contributor.author: Wong, N
dc.date.accessioned: 2019-09-10T02:39:20Z
dc.date.available: 2019-09-10T02:39:20Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings of 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14-19 July 2019
dc.identifier.isbn: 978-1-7281-1986-1
dc.identifier.uri: http://hdl.handle.net/10722/275280
dc.description.abstract: There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms. The support tensor machine (STM) and the support Tucker machine (STuM) are two typical tensor generalizations of the conventional support vector machine (SVM). However, the expressive power of the STM is restricted by its rank-one tensor constraint, and the STuM is not scalable because of its exponentially sized Tucker core tensor. To overcome these limitations, we introduce a novel and effective support tensor train machine (STTM) that employs a general and scalable tensor train as the parameter model. Experiments confirm the superiority of the STTM over the SVM, STM and STuM.
dc.language: eng
dc.publisher: IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500
dc.relation.ispartof: International Joint Conference on Neural Networks (IJCNN)
dc.rights: International Joint Conference on Neural Networks (IJCNN). Copyright © IEEE.
dc.rights: ©2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.title: A support tensor train machine
dc.type: Conference_Paper
dc.identifier.email: Wong, N: nwong@eee.hku.hk
dc.identifier.authority: Wong, N=rp00190
dc.identifier.doi: 10.1109/IJCNN.2019.8851985
dc.identifier.hkuros: 304919
dc.publisher.place: United States
