File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: https://doi.org/10.1016/j.mlwa.2023.100479
Supplementary
Article: Multilinear multitask learning by transformed tensor singular value decomposition
Field | Value
---|---
Title | Multilinear multitask learning by transformed tensor singular value decomposition
Authors | Zhang, Xiongjun; Wu, Jin; Ng, Michael K
Issue Date | 24-Jun-2023
Publisher | Elsevier Ltd.
Citation | Machine Learning with Applications, 2023, v. 13
Abstract | In this paper, we study the problem of multilinear multitask learning (MLMTL), in which all tasks are stacked into a third-order tensor for consideration. In contrast to conventional multitask learning, MLMTL can better explore inherent correlations among multiple tasks by utilizing a multilinear low-rank structure. Existing approaches to MLMTL are mainly based on the sum of singular values for approximating low-rank matrices obtained by matricizing the third-order tensor. However, these methods are suboptimal in the Tucker rank approximation. In order to elucidate intrinsic correlations among multiple tasks, we present a new approach that uses a transformed tensor nuclear norm (TTNN) constraint in the objective function. The main advantage of the proposed approach is that it can acquire a low transformed multi-rank structure in a transformed tensor by applying suitable unitary transformations, which is helpful for determining principal components in grouping multiple tasks to describe their intrinsic correlations more precisely. Furthermore, we establish an excess risk bound for the minimizer of the proposed TTNN approach. Experimental results, including synthetic problems and real-world images, show that the mean-square errors of the proposed method are lower than those of existing methods for different numbers of tasks and training samples in MLMTL.
Persistent Identifier | http://hdl.handle.net/10722/331061
ISSN | 2666-8270
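The TTNN quantity described in the abstract can be illustrated with a short sketch: stack the tasks as frontal slices of a third-order tensor, apply a unitary transform along the third (task) mode, and sum the singular values of the frontal slices in the transformed domain. The code below is a minimal illustration, not the authors' implementation; the function name, tensor sizes, and the choice of a unitary DFT matrix as the transform are assumptions made for the example.

```python
# Minimal sketch of a transformed tensor nuclear norm (TTNN), assuming NumPy.
# The transform choice (normalized DFT) is illustrative; the paper allows any
# suitable unitary transformation along the third mode.
import numpy as np

def transformed_tensor_nuclear_norm(W: np.ndarray, Phi: np.ndarray) -> float:
    """TTNN of a third-order tensor W (n1 x n2 x n3) under a unitary matrix Phi (n3 x n3)."""
    n3 = W.shape[2]
    assert Phi.shape == (n3, n3)
    # Move W into the transformed domain along the third mode:
    # W_hat[:, :, k] = sum_j Phi[k, j] * W[:, :, j]
    W_hat = np.einsum('kj,abj->abk', Phi, W)
    # TTNN = sum over frontal slices of the nuclear norm (sum of singular values)
    return float(sum(np.linalg.svd(W_hat[:, :, k], compute_uv=False).sum()
                     for k in range(n3)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n1, n2, n3 = 8, 6, 5                          # illustrative sizes (e.g., features x outputs x tasks)
    W = rng.standard_normal((n1, n2, n3))         # stacked task weight matrices
    Phi = np.fft.fft(np.eye(n3)) / np.sqrt(n3)    # unitary DFT matrix; one admissible transform
    print("TTNN:", transformed_tensor_nuclear_norm(W, Phi))
```

With the identity matrix as the transform, this reduces to the sum of the nuclear norms of the original frontal slices; as the abstract notes, choosing a suitable unitary transform can expose a lower transformed multi-rank structure across the tasks.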
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhang, Xiongjun | - |
dc.contributor.author | Wu, Jin | - |
dc.contributor.author | Ng, Michael K | - |
dc.date.accessioned | 2023-09-21T06:52:26Z | - |
dc.date.available | 2023-09-21T06:52:26Z | - |
dc.date.issued | 2023-06-24 | - |
dc.identifier.citation | Machine Learning with Applications, 2023, v. 13 | - |
dc.identifier.issn | 2666-8270 | - |
dc.identifier.uri | http://hdl.handle.net/10722/331061 | - |
dc.description.abstract | In this paper, we study the problem of multilinear multitask learning (MLMTL), in which all tasks are stacked into a third-order tensor for consideration. In contrast to conventional multitask learning, MLMTL can better explore inherent correlations among multiple tasks by utilizing a multilinear low-rank structure. Existing approaches to MLMTL are mainly based on the sum of singular values for approximating low-rank matrices obtained by matricizing the third-order tensor. However, these methods are suboptimal in the Tucker rank approximation. In order to elucidate intrinsic correlations among multiple tasks, we present a new approach that uses a transformed tensor nuclear norm (TTNN) constraint in the objective function. The main advantage of the proposed approach is that it can acquire a low transformed multi-rank structure in a transformed tensor by applying suitable unitary transformations, which is helpful for determining principal components in grouping multiple tasks to describe their intrinsic correlations more precisely. Furthermore, we establish an excess risk bound for the minimizer of the proposed TTNN approach. Experimental results, including synthetic problems and real-world images, show that the mean-square errors of the proposed method are lower than those of existing methods for different numbers of tasks and training samples in MLMTL. | -
dc.language | eng | - |
dc.publisher | Elsevier Ltd. | - |
dc.relation.ispartof | Machine Learning with Applications | - |
dc.title | Multilinear multitask learning by transformed tensor singular value decomposition | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.mlwa.2023.100479 | - |
dc.identifier.volume | 13 | - |
dc.identifier.issnl | 2666-8270 | - |