Conference Paper: Fine-grained Generalisation Analysis of Inductive Matrix Completion

Title: Fine-grained Generalisation Analysis of Inductive Matrix Completion
Authors: Ledent, Antoine; Alves, Rodrigo; Lei, Yunwen; Kloft, Marius
Issue Date: 2021
Citation: Advances in Neural Information Processing Systems, 2021, v. 31, p. 25540-25552
Abstract: In this paper, we bridge the gap between the state-of-the-art theoretical results for matrix completion with the nuclear norm and their equivalent in inductive matrix completion: (1) In the distribution-free setting, we prove sample complexity bounds improving the previously best rate of rd² to d^{3/2}√r log(d), where d is the dimension of the side information and r is the rank. (2) We introduce the (smoothed) adjusted trace-norm minimization strategy, an inductive analogue of the weighted trace norm, for which we show guarantees of the order O(dr log(d)) under arbitrary sampling. In the inductive case, a similar rate was previously achieved only under uniform sampling and for exact recovery. Both our results align with the state of the art in the particular case of standard (non-inductive) matrix completion, where they are known to be tight up to log terms. Experiments further confirm that our strategy outperforms standard inductive matrix completion on various synthetic datasets and real problems, justifying its place as an important tool in the arsenal of methods for matrix completion using side information.
Persistent Identifier: http://hdl.handle.net/10722/329859
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
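The abstract concerns inductive matrix completion: recovering a matrix M ≈ X Z Yᵀ from a subset of its entries, where X and Y are known side-information matrices and the small core Z is fit under a nuclear-norm (trace-norm) penalty. The paper's (smoothed) adjusted trace-norm strategy itself is not reproduced here; the following is only a minimal illustrative sketch of the plain nuclear-norm-regularized formulation the paper builds on, solved by proximal gradient descent with singular value thresholding. All names (`svt`, `inductive_mc`) and parameter choices are illustrative assumptions, not the authors' code.

```python
import numpy as np

def svt(Z, tau):
    # Singular value thresholding: the proximal operator of
    # tau * ||Z||_* (nuclear norm). Shrinks each singular value by tau.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def inductive_mc(M_obs, mask, X, Y, lam=0.01, lr=1.0, iters=800):
    # Illustrative sketch (not the paper's adjusted-trace-norm method):
    # fit a core matrix Z so that X @ Z @ Y.T matches the observed
    # entries (mask == 1), with nuclear-norm regularization on Z,
    # via proximal gradient descent.
    d1, d2 = X.shape[1], Y.shape[1]
    Z = np.zeros((d1, d2))
    for _ in range(iters):
        residual = mask * (X @ Z @ Y.T - M_obs)   # error on observed entries only
        grad = X.T @ residual @ Y                 # gradient of the squared loss
        Z = svt(Z - lr * grad, lr * lam)          # gradient step + prox
    return Z

# Hypothetical usage: low-rank ground truth with orthonormal side information
# (orthonormal X, Y keep the gradient Lipschitz constant at 1, so lr=1 is stable).
rng = np.random.default_rng(0)
n, m, d = 40, 40, 8
X, _ = np.linalg.qr(rng.standard_normal((n, d)))
Y, _ = np.linalg.qr(rng.standard_normal((m, d)))
Z_star = rng.standard_normal((d, 2)) @ rng.standard_normal((2, d))  # rank-2 core
M = X @ Z_star @ Y.T
mask = (rng.random((n, m)) < 0.6).astype(float)   # observe ~60% of entries
Z_hat = inductive_mc(mask * M, mask, X, Y)
rel_err = np.linalg.norm(X @ Z_hat @ Y.T - M) / np.linalg.norm(M)
```

Because the side information reduces the unknowns from n×m matrix entries to a d×d core, far fewer observations are needed than in standard matrix completion — which is the regime the paper's d^{3/2}√r log(d) and O(dr log(d)) bounds quantify.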

 

DC Field | Value | Language
dc.contributor.author | Ledent, Antoine | -
dc.contributor.author | Alves, Rodrigo | -
dc.contributor.author | Lei, Yunwen | -
dc.contributor.author | Kloft, Marius | -
dc.date.accessioned | 2023-08-09T03:35:52Z | -
dc.date.available | 2023-08-09T03:35:52Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Advances in Neural Information Processing Systems, 2021, v. 31, p. 25540-25552 | -
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | http://hdl.handle.net/10722/329859 | -
dc.description.abstract | In this paper, we bridge the gap between the state-of-the-art theoretical results for matrix completion with the nuclear norm and their equivalent in inductive matrix completion: (1) In the distribution-free setting, we prove sample complexity bounds improving the previously best rate of rd² to d^{3/2}√r log(d), where d is the dimension of the side information and r is the rank. (2) We introduce the (smoothed) adjusted trace-norm minimization strategy, an inductive analogue of the weighted trace norm, for which we show guarantees of the order O(dr log(d)) under arbitrary sampling. In the inductive case, a similar rate was previously achieved only under uniform sampling and for exact recovery. Both our results align with the state of the art in the particular case of standard (non-inductive) matrix completion, where they are known to be tight up to log terms. Experiments further confirm that our strategy outperforms standard inductive matrix completion on various synthetic datasets and real problems, justifying its place as an important tool in the arsenal of methods for matrix completion using side information. | -
dc.language | eng | -
dc.relation.ispartof | Advances in Neural Information Processing Systems | -
dc.title | Fine-grained Generalisation Analysis of Inductive Matrix Completion | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85131916934 | -
dc.identifier.volume | 31 | -
dc.identifier.spage | 25540 | -
dc.identifier.epage | 25552 | -
