
Conference Paper: Communication efficient parallel algorithms for optimization on manifolds

Title: Communication efficient parallel algorithms for optimization on manifolds
Authors: Saparbayeva, Bayan; Zhang, Michael Minyi; Lin, Lizhen
Issue Date: 2018
Citation: 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada, 2-8 December 2018. In Advances in Neural Information Processing Systems 31 (NeurIPS 2018), 2018, p. 3574-3584
Abstract: The last decade has witnessed an explosion in the development of models, theory and computational algorithms for “big data” analysis. In particular, distributed computing has served as a natural and dominant paradigm for statistical inference. However, the existing literature on parallel inference focuses almost exclusively on Euclidean data and parameters. While this assumption is valid for many applications, it is increasingly common to encounter problems where the data or the parameters lie on a non-Euclidean space, such as a manifold. Our work aims to fill a critical gap in the literature by generalizing parallel inference algorithms to optimization on manifolds. We show that our proposed algorithm is both communication efficient and carries theoretical convergence guarantees. In addition, we demonstrate the performance of our algorithm on the estimation of Fréchet means from simulated spherical data and on the low-rank matrix completion problem over Grassmann manifolds, applied to the Netflix prize data set.
Persistent Identifier: http://hdl.handle.net/10722/296186
ISSN: 1049-5258; 2020 SCImago Journal Rankings: 1.399
ISI Accession Number ID: WOS:000461823303056
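The abstract mentions estimating Fréchet means on simulated spherical data. The paper's communication-efficient parallel scheme is not reproduced here; the sketch below is only a minimal serial illustration of the underlying idea, computing a Fréchet mean on the unit sphere by Riemannian gradient descent. The helper names (`sphere_log`, `sphere_exp`, `frechet_mean`) and the step size are hypothetical choices for this sketch, not taken from the paper.

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: tangent vector at p pointing toward q."""
    cos_t = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-10:
        return np.zeros_like(p)
    u = q - cos_t * p
    return theta * u / np.linalg.norm(u)

def sphere_exp(p, v):
    """Exp map on the unit sphere: follow the geodesic from p along v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-10:
        return p
    return np.cos(norm_v) * p + np.sin(norm_v) * v / norm_v

def frechet_mean(points, n_iter=100, step=0.5):
    """Riemannian gradient descent for the Fréchet mean (step size assumed).

    The Fréchet mean minimizes the mean squared geodesic distance to the
    sample points; its Riemannian gradient at mu is minus the average of
    the log maps log_mu(q), so we step along that average.
    """
    mu = points[0]
    for _ in range(n_iter):
        grad = np.mean([sphere_log(mu, q) for q in points], axis=0)
        mu = sphere_exp(mu, step * grad)
    return mu
```

For points sampled near a common direction and normalized onto the sphere, the returned mean stays on the sphere and lies close to that direction. The paper's contribution is to run such manifold optimization across workers with few communication rounds; this sketch is the single-machine building block only.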

 

DC Field | Value | Language
dc.contributor.author | Saparbayeva, Bayan | -
dc.contributor.author | Zhang, Michael Minyi | -
dc.contributor.author | Lin, Lizhen | -
dc.date.accessioned | 2021-02-11T04:53:01Z | -
dc.date.available | 2021-02-11T04:53:01Z | -
dc.date.issued | 2018 | -
dc.identifier.citation | 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada, 2-8 December 2018. In Advances in Neural Information Processing Systems 31 (NeurIPS 2018), 2018, p. 3574-3584 | -
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | http://hdl.handle.net/10722/296186 | -
dc.description.abstract | The last decade has witnessed an explosion in the development of models, theory and computational algorithms for “big data” analysis. In particular, distributed computing has served as a natural and dominant paradigm for statistical inference. However, the existing literature on parallel inference focuses almost exclusively on Euclidean data and parameters. While this assumption is valid for many applications, it is increasingly common to encounter problems where the data or the parameters lie on a non-Euclidean space, such as a manifold. Our work aims to fill a critical gap in the literature by generalizing parallel inference algorithms to optimization on manifolds. We show that our proposed algorithm is both communication efficient and carries theoretical convergence guarantees. In addition, we demonstrate the performance of our algorithm on the estimation of Fréchet means from simulated spherical data and on the low-rank matrix completion problem over Grassmann manifolds, applied to the Netflix prize data set. | -
dc.language | eng | -
dc.relation.ispartof | Advances in Neural Information Processing Systems 31 (NeurIPS 2018) | -
dc.title | Communication efficient parallel algorithms for optimization on manifolds | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_OA_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85064812619 | -
dc.identifier.spage | 3574 | -
dc.identifier.epage | 3584 | -
dc.identifier.isi | WOS:000461823303056 | -
dc.identifier.issnl | 1049-5258 | -
