Article: Bayesian Tensor Tucker Completion With a Flexible Core
Title | Bayesian Tensor Tucker Completion With a Flexible Core |
---|---|
Authors | Tong, Xueke; Cheng, Lei; Wu, Yik-Chung |
Keywords | Bayesian Tucker model; Gaussian-Wishart priors; multilinear rank estimation; Tensor decomposition |
Issue Date | 2-Nov-2023 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Transactions on Signal Processing, 2023, v. 71, p. 4077-4091 |
Abstract | Tensor completion is a vital task in multi-dimensional signal processing and machine learning. To recover the missing data in a tensor, various low-rank structures of a tensor can be assumed, and the Tucker format is a popular choice. However, the promising capability of Tucker completion is realized only when we can determine a suitable multilinear rank, which controls the model complexity and thus is essential to avoid overfitting/underfitting. Rather than exhaustively searching for the best multilinear rank, which is computationally inefficient, recent advances have proposed Bayesian methods that learn the multilinear rank from training data automatically. However, in prior art, only a single parameter is dedicated to learning the variance of the core tensor elements. This rigid assumption restricts the modeling capability of existing methods on real-world data, where the core tensor elements may have a wide range of variances. To have a flexible core tensor while still retaining succinct Bayesian modeling, we first bridge the tensor Tucker decomposition to the canonical polyadic decomposition (CPD) with low-rank factor matrices, and then propose a novel Bayesian model based on the Gaussian-inverse Wishart prior. An inference algorithm is further derived under the variational inference framework. Extensive numerical studies on synthetic data and real-world datasets demonstrate the significantly improved performance of the proposed algorithm in terms of multilinear rank learning and missing data recovery. (An illustrative sketch of the Tucker-to-CPD bridge appears after this table.) |
Persistent Identifier | http://hdl.handle.net/10722/339304 |
ISSN | 1053-587X (2023 Impact Factor: 4.6; 2023 SCImago Journal Rankings: 2.520) |
ISI Accession Number ID | WOS:001102090400004 |
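
Since the record carries no code, the following minimal numpy sketch illustrates the Tucker format and the Tucker-to-CPD bridge mentioned in the abstract. It is not the authors' implementation; all dimensions and ranks are hypothetical, chosen only to make the equivalence checkable.

```python
# Illustrative sketch only (not the paper's code): the Tucker format and its
# bridge to a CPD with low-rank factor matrices, as described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

# Tucker model: X = G x_1 U1 x_2 U2 x_3 U3 with multilinear rank (R1, R2, R3).
I, J, K = 6, 5, 4             # tensor dimensions (hypothetical)
R1, R2, R3 = 3, 2, 2          # multilinear rank (hypothetical)
G = rng.standard_normal((R1, R2, R3))     # core tensor
U1 = rng.standard_normal((I, R1))         # factor matrices
U2 = rng.standard_normal((J, R2))
U3 = rng.standard_normal((K, R3))

# X[i,j,k] = sum_{a,b,c} G[a,b,c] * U1[i,a] * U2[j,b] * U3[k,c]
X = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

# Bridge to CPD: expanding the sum over core entries gives a CPD with
# R = R1*R2*R3 rank-1 terms whose factor matrices reuse the Tucker columns.
triples = np.array(list(np.ndindex(R1, R2, R3)))  # all (a, b, c) triples
A1 = U1[:, triples[:, 0]]                         # at most R1 distinct columns
A2 = U2[:, triples[:, 1]]                         # at most R2 distinct columns
A3 = U3[:, triples[:, 2]] * G[triples[:, 0], triples[:, 1], triples[:, 2]]

X_cpd = np.einsum('ir,jr,kr->ijk', A1, A2, A3)
assert np.allclose(X, X_cpd)  # Tucker == CPD with low-rank factor matrices
```

Because each CPD factor matrix above has at most R_n distinct column directions, its rank is bounded by R_n, so estimating the ranks of these matrices recovers the multilinear rank. The paper's contribution, a Gaussian-inverse Wishart prior over this structure with a variational inference algorithm, is not reproduced in this sketch.
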
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tong, Xueke | - |
dc.contributor.author | Cheng, Lei | - |
dc.contributor.author | Wu, Yik-Chung | - |
dc.date.accessioned | 2024-03-11T10:35:33Z | - |
dc.date.available | 2024-03-11T10:35:33Z | - |
dc.date.issued | 2023-11-02 | - |
dc.identifier.citation | IEEE Transactions on Signal Processing, 2023, v. 71, p. 4077-4091 | - |
dc.identifier.issn | 1053-587X | - |
dc.identifier.uri | http://hdl.handle.net/10722/339304 | - |
dc.description.abstract | Tensor completion is a vital task in multi-dimensional signal processing and machine learning. To recover the missing data in a tensor, various low-rank structures of a tensor can be assumed, and the Tucker format is a popular choice. However, the promising capability of Tucker completion is realized only when we can determine a suitable multilinear rank, which controls the model complexity and thus is essential to avoid overfitting/underfitting. Rather than exhaustively searching for the best multilinear rank, which is computationally inefficient, recent advances have proposed Bayesian methods that learn the multilinear rank from training data automatically. However, in prior art, only a single parameter is dedicated to learning the variance of the core tensor elements. This rigid assumption restricts the modeling capability of existing methods on real-world data, where the core tensor elements may have a wide range of variances. To have a flexible core tensor while still retaining succinct Bayesian modeling, we first bridge the tensor Tucker decomposition to the canonical polyadic decomposition (CPD) with low-rank factor matrices, and then propose a novel Bayesian model based on the Gaussian-inverse Wishart prior. An inference algorithm is further derived under the variational inference framework. Extensive numerical studies on synthetic data and real-world datasets demonstrate the significantly improved performance of the proposed algorithm in terms of multilinear rank learning and missing data recovery. | -
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Transactions on Signal Processing | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Bayesian Tucker model | - |
dc.subject | Gaussian-Wishart priors | - |
dc.subject | multilinear rank estimation | - |
dc.subject | Tensor decomposition | - |
dc.title | Bayesian Tensor Tucker Completion With a Flexible Core | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TSP.2023.3327845 | - |
dc.identifier.scopus | eid_2-s2.0-85177023594 | - |
dc.identifier.volume | 71 | - |
dc.identifier.spage | 4077 | - |
dc.identifier.epage | 4091 | - |
dc.identifier.eissn | 1941-0476 | - |
dc.identifier.isi | WOS:001102090400004 | - |
dc.identifier.issnl | 1053-587X | - |