Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1016/j.patcog.2023.109650
- Scopus: eid_2-s2.0-85154595835
- Web of Science: WOS:001005785300001
Article: Tensor train factorization under noisy and incomplete data with automatic rank estimation
Title | Tensor train factorization under noisy and incomplete data with automatic rank estimation |
---|---|
Authors | Xu, Le; Cheng, Lei; Wong, Ngai; Wu, Yik-Chung |
Keywords | Bayesian inference; Tensor completion; Tensor train |
Issue Date | 1-Sep-2023 |
Publisher | Elsevier |
Citation | Pattern Recognition, 2023, v. 141 |
Abstract | As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either overfit easily in the presence of noise or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, an effective learning algorithm for the probabilistic model parameters is derived within the variational inference framework. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete noisy observations. Further experiments on image and video data also show its performance to be superior to that of existing TT decomposition algorithms. |
Persistent Identifier | http://hdl.handle.net/10722/339301 |
ISSN | 0031-3203 (2023 Impact Factor: 7.5; 2023 SCImago Journal Rankings: 2.732) |
ISI Accession Number ID | WOS:001005785300001 |
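To make the TT format concrete, the following is a minimal NumPy sketch of the classical TT-SVD, not the paper's algorithm: it factorizes a fully observed, noise-free tensor into TT cores by sequential SVDs with a hand-set truncation threshold `eps`. The function names and the threshold are illustrative assumptions; the paper's Bayesian method replaces this fixed threshold with automatic rank and noise-power inference, and additionally handles missing entries.

```python
import numpy as np

def tt_svd(X, eps=1e-8):
    """Sequential-SVD TT decomposition of a full tensor.

    Returns cores G_k of shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1.
    """
    shape = X.shape
    d = len(shape)
    cores = []
    r = 1
    C = X.reshape(1, -1)
    for k in range(d - 1):
        C = C.reshape(r * shape[k], -1)      # unfold: rows group (r_{k-1}, n_k)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r_new = int(np.sum(s > eps * s[0]))  # truncate negligible singular values
        cores.append(U[:, :r_new].reshape(r, shape[k], r_new))
        C = s[:r_new, None] * Vt[:r_new]     # carry the remainder forward
        r = r_new
    cores.append(C.reshape(r, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor."""
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=(-1, 0))
    return out.reshape(out.shape[1:-1])

# Build a tensor with known TT ranks (1, 2, 3, 1) and recover them:
rng = np.random.default_rng(0)
G1 = rng.standard_normal((1, 4, 2))
G2 = rng.standard_normal((2, 5, 3))
G3 = rng.standard_normal((3, 6, 1))
X = tt_reconstruct([G1, G2, G3])
cores = tt_svd(X)
print([G.shape[2] for G in cores])  # recovered TT ranks: [2, 3, 1]
```

Note that `eps` must be tuned by hand here; choosing it poorly either absorbs noise into the cores or discards true structure, which is exactly the accuracy/complexity trade-off the abstract says the Bayesian treatment avoids.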
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xu, Le | - |
dc.contributor.author | Cheng, Lei | - |
dc.contributor.author | Wong, Ngai | - |
dc.contributor.author | Wu, Yik-Chung | - |
dc.date.accessioned | 2024-03-11T10:35:32Z | - |
dc.date.available | 2024-03-11T10:35:32Z | - |
dc.date.issued | 2023-09-01 | - |
dc.identifier.citation | Pattern Recognition, 2023, v. 141 | - |
dc.identifier.issn | 0031-3203 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339301 | - |
dc.description.abstract | As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either overfit easily in the presence of noise or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, an effective learning algorithm for the probabilistic model parameters is derived within the variational inference framework. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete noisy observations. Further experiments on image and video data also show its performance to be superior to that of existing TT decomposition algorithms. | - |
dc.language | eng | - |
dc.publisher | Elsevier | - |
dc.relation.ispartof | Pattern Recognition | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Bayesian inference | - |
dc.subject | Tensor completion | - |
dc.subject | Tensor train | - |
dc.title | Tensor train factorization under noisy and incomplete data with automatic rank estimation | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.patcog.2023.109650 | - |
dc.identifier.scopus | eid_2-s2.0-85154595835 | - |
dc.identifier.volume | 141 | - |
dc.identifier.isi | WOS:001005785300001 | - |
dc.identifier.issnl | 0031-3203 | - |
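The sparsity mechanism behind the automatic rank determination can be illustrated with a small, heavily simplified sketch. This is not the paper's exact Gaussian-product-Gamma construction: a single Gamma-distributed precision per lateral slice stands in for the product of Gamma variables, and the precision values below are assumed, not inferred. The point it demonstrates is generic ARD-style shrinkage: slices whose precision is driven to a large value collapse toward zero, effectively pruning the corresponding TT rank.

```python
import numpy as np

rng = np.random.default_rng(1)

R = 6    # candidate TT rank (number of lateral slices in a core)
n = 50   # entries per lateral slice

# Suppose inference has driven three of the six precisions to large values;
# the Gaussian prior N(0, 1/lam_r) then shrinks those slices toward zero.
lam = np.array([0.5, 0.8, 1e6, 1.2, 1e6, 1e6])
slices = rng.standard_normal((n, R)) / np.sqrt(lam)

energy = np.linalg.norm(slices, axis=0)
active = energy > 1e-2 * energy.max()   # slices that survive shrinkage
print("estimated rank:", int(active.sum()))  # 3 slices survive
```

In the full Bayesian model, the variational updates estimate these precisions jointly with the cores and the noise power, so the effective ranks emerge from the data rather than from a user-chosen threshold.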