Article: Bayesian Tensor Tucker Completion With a Flexible Core

Title: Bayesian Tensor Tucker Completion With a Flexible Core
Authors: Tong, Xueke; Cheng, Lei; Wu, Yik-Chung
Keywords: Bayesian Tucker model; Gaussian-Wishart priors; multilinear rank estimation; Tensor decomposition
Issue Date: 2-Nov-2023
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Transactions on Signal Processing, 2023, v. 71, p. 4077-4091
Abstract

Tensor completion is a vital task in multi-dimensional signal processing and machine learning. To recover the missing data in a tensor, various low-rank structures of a tensor can be assumed, and the Tucker format is a popular choice. However, the promising capability of Tucker completion is realized only when we can determine a suitable multilinear rank, which controls the model complexity and thus is essential to avoid overfitting/underfitting. Rather than exhaustively searching for the best multilinear rank, which is computationally inefficient, recent advances have proposed a Bayesian way to learn the multilinear rank from training data automatically. However, in prior art, only a single parameter is dedicated to learning the variance of the core tensor elements. This rigid assumption restricts the modeling capabilities of existing methods on real-world data, where the core tensor elements may have a wide range of variances. To have a flexible core tensor while still retaining succinct Bayesian modeling, we first bridge the tensor Tucker decomposition to the canonical polyadic decomposition (CPD) with low-rank factor matrices, and then propose a novel Bayesian model based on the Gaussian-inverse Wishart prior. An inference algorithm is then derived under the variational inference framework. Extensive numerical studies on synthetic data and real-world datasets demonstrate the significantly improved performance of the proposed algorithm in terms of multilinear rank learning and missing data recovery.
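The Tucker format the abstract builds on expresses a tensor as a core tensor multiplied by a factor matrix along each mode, with the multilinear rank given by the core's dimensions. The following is a minimal sketch of that construction with NumPy; the shapes and the multilinear rank (3, 4, 2) are hypothetical choices for illustration, and this does not reproduce the paper's Bayesian inference algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K = 6, 7, 5          # tensor dimensions
R1, R2, R3 = 3, 4, 2       # multilinear rank (core dimensions)

G = rng.standard_normal((R1, R2, R3))   # core tensor
U1 = rng.standard_normal((I, R1))       # factor matrix, mode 1
U2 = rng.standard_normal((J, R2))       # factor matrix, mode 2
U3 = rng.standard_normal((K, R3))       # factor matrix, mode 3

# Tucker format X = G x1 U1 x2 U2 x3 U3: contract each core mode
# with its factor matrix via a single einsum.
X = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)
assert X.shape == (I, J, K)
```

Note the connection the abstract exploits: a CPD is the special case where the core is superdiagonal, so a Tucker model with low-rank factor matrices can be viewed through a CPD with structured factors.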


Persistent Identifier: http://hdl.handle.net/10722/339304
ISSN: 1053-587X
2023 Impact Factor: 4.6
2023 SCImago Journal Rankings: 2.520
ISI Accession Number ID: WOS:001102090400004

DC Field: Value
dc.contributor.author: Tong, Xueke
dc.contributor.author: Cheng, Lei
dc.contributor.author: Wu, Yik-Chung
dc.date.accessioned: 2024-03-11T10:35:33Z
dc.date.available: 2024-03-11T10:35:33Z
dc.date.issued: 2023-11-02
dc.identifier.citation: IEEE Transactions on Signal Processing, 2023, v. 71, p. 4077-4091
dc.identifier.issn: 1053-587X
dc.identifier.uri: http://hdl.handle.net/10722/339304
dc.description.abstract: Tensor completion is a vital task in multi-dimensional signal processing and machine learning. To recover the missing data in a tensor, various low-rank structures of a tensor can be assumed, and the Tucker format is a popular choice. However, the promising capability of Tucker completion is realized only when we can determine a suitable multilinear rank, which controls the model complexity and thus is essential to avoid overfitting/underfitting. Rather than exhaustively searching for the best multilinear rank, which is computationally inefficient, recent advances have proposed a Bayesian way to learn the multilinear rank from training data automatically. However, in prior art, only a single parameter is dedicated to learning the variance of the core tensor elements. This rigid assumption restricts the modeling capabilities of existing methods on real-world data, where the core tensor elements may have a wide range of variances. To have a flexible core tensor while still retaining succinct Bayesian modeling, we first bridge the tensor Tucker decomposition to the canonical polyadic decomposition (CPD) with low-rank factor matrices, and then propose a novel Bayesian model based on the Gaussian-inverse Wishart prior. An inference algorithm is then derived under the variational inference framework. Extensive numerical studies on synthetic data and real-world datasets demonstrate the significantly improved performance of the proposed algorithm in terms of multilinear rank learning and missing data recovery.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Transactions on Signal Processing
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Bayesian Tucker model
dc.subject: Gaussian-Wishart priors
dc.subject: multilinear rank estimation
dc.subject: Tensor decomposition
dc.title: Bayesian Tensor Tucker Completion With a Flexible Core
dc.type: Article
dc.identifier.doi: 10.1109/TSP.2023.3327845
dc.identifier.scopus: eid_2-s2.0-85177023594
dc.identifier.volume: 71
dc.identifier.spage: 4077
dc.identifier.epage: 4091
dc.identifier.eissn: 1941-0476
dc.identifier.isi: WOS:001102090400004
dc.identifier.issnl: 1053-587X
