Article: Tensor train factorization under noisy and incomplete data with automatic rank estimation

Title: Tensor train factorization under noisy and incomplete data with automatic rank estimation
Authors: Xu, Le; Cheng, Lei; Wong, Ngai; Wu, Yik-Chung
Keywords: Bayesian inference; Tensor completion; Tensor train
Issue Date: 1-Sep-2023
Publisher: Elsevier
Citation: Pattern Recognition, 2023, v. 141
Abstract

As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either easily overfit to noise or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, an effective learning algorithm for the probabilistic model parameters is derived under the variational inference framework. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete, noisy observations. Further experiments on image and video data show its superior performance over existing TT decomposition algorithms.
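To make the TT format in the abstract concrete: a d-way tensor is stored as a chain of cores, where core k has shape (R_{k-1}, I_k, R_k) with boundary ranks R_0 = R_d = 1, and the TT ranks R_k are exactly the quantities the paper's Bayesian model determines automatically. The sketch below is not the authors' code; the helper name, shapes, and ranks are illustrative assumptions, showing only how a full tensor is rebuilt from such cores with NumPy.

```python
import numpy as np

def tt_reconstruct(cores):
    """Rebuild a full tensor from its tensor-train (TT) cores.

    Core k has shape (R_{k-1}, I_k, R_k) with boundary ranks R_0 = R_d = 1,
    so entry X[i_1, ..., i_d] equals the matrix product
    G_1[:, i_1, :] @ G_2[:, i_2, :] @ ... @ G_d[:, i_d, :].
    """
    full = cores[0]  # shape (1, I_1, R_1)
    for core in cores[1:]:
        # contract the trailing rank index with the next core's leading rank index
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))  # drop the singleton boundary ranks

# Illustrative example: a 4 x 5 x 6 tensor with (assumed) TT ranks (1, 2, 3, 1)
rng = np.random.default_rng(0)
cores = [rng.standard_normal(shape) for shape in [(1, 4, 2), (2, 5, 3), (3, 6, 1)]]
X = tt_reconstruct(cores)
print(X.shape)  # (4, 5, 6)
```

In the paper's setting the ranks would not be fixed by hand as they are in this toy example: per the abstract, the Gaussian-product-Gamma priors induce sparsity on the core slices, so the effective ranks R_k and the noise power are inferred from the noisy, incomplete observations.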


Persistent Identifier: http://hdl.handle.net/10722/339301
ISSN: 0031-3203
2023 Impact Factor: 7.5
2023 SCImago Journal Rankings: 2.732
ISI Accession Number ID: WOS:001005785300001

 

DC Field | Value
dc.contributor.author | Xu, Le
dc.contributor.author | Cheng, Lei
dc.contributor.author | Wong, Ngai
dc.contributor.author | Wu, Yik-Chung
dc.date.accessioned | 2024-03-11T10:35:32Z
dc.date.available | 2024-03-11T10:35:32Z
dc.date.issued | 2023-09-01
dc.identifier.citation | Pattern Recognition, 2023, v. 141
dc.identifier.issn | 0031-3203
dc.identifier.uri | http://hdl.handle.net/10722/339301
dc.description.abstract | As a powerful tool for analyzing multi-dimensional data, tensor train (TT) decomposition shows superior performance compared to other tensor decomposition formats. Existing TT decomposition methods, however, either easily overfit to noise or require substantial fine-tuning to strike a balance between recovery accuracy and model complexity. To avoid these shortcomings, this paper treats TT decomposition from a fully Bayesian perspective, which includes automatic TT rank determination and noise power estimation. Theoretical justification is provided for adopting Gaussian-product-Gamma priors to induce sparsity on the slices of the TT cores, allowing the model complexity to be determined automatically even when the observed tensor data is noisy and contains many missing values. Furthermore, an effective learning algorithm for the probabilistic model parameters is derived under the variational inference framework. Simulations on synthetic data demonstrate that the proposed algorithm accurately recovers the underlying TT structure from incomplete, noisy observations. Further experiments on image and video data show its superior performance over existing TT decomposition algorithms.
dc.language | eng
dc.publisher | Elsevier
dc.relation.ispartof | Pattern Recognition
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject | Bayesian inference
dc.subject | Tensor completion
dc.subject | Tensor train
dc.title | Tensor train factorization under noisy and incomplete data with automatic rank estimation
dc.type | Article
dc.identifier.doi | 10.1016/j.patcog.2023.109650
dc.identifier.scopus | eid_2-s2.0-85154595835
dc.identifier.volume | 141
dc.identifier.isi | WOS:001005785300001
dc.identifier.issnl | 0031-3203
