Article: Accelerated algorithms for convex and non-convex optimization on manifolds

Title: Accelerated algorithms for convex and non-convex optimization on manifolds
Authors: Lin, Lizhen; Saparbayeva, Bayan; Zhang, Michael Minyi; Dunson, David B.
Keywords: Manifolds; Non-convex optimization; Weakly convex functions
Issue Date: 6-Feb-2025
Publisher: Springer
Citation: Machine Learning, 2025, v. 114, n. 3
Abstract: We propose a general scheme for solving convex and non-convex optimization problems on manifolds. The central idea is that, by adding a multiple of the squared retraction distance to the objective function in question, we “convexify” the objective function and solve a series of convex sub-problems in the optimization procedure. Our proposed algorithm adapts to the level of complexity in the objective function without requiring knowledge of the convexity or non-convexity of the objective function. We show that when the objective function is convex, the algorithm provably converges to the optimum and leads to accelerated convergence. When the objective function is non-convex, the algorithm will converge to a stationary point. Our proposed method unifies insights from Nesterov’s original idea for accelerating gradient descent algorithms with recent developments in optimization algorithms in Euclidean space. We demonstrate the utility of our algorithms on several manifold optimization tasks such as estimating intrinsic and extrinsic Fréchet means on spheres and low-rank matrix factorization with Grassmann manifolds applied to the Netflix rating data set.
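The abstract's central construction is a proximal-style sub-problem: at each iterate x, minimize f(y) + (lambda/2) * d(x, y)^2, where d is the retraction distance; for lambda at least the weak-convexity modulus of f, this sub-problem is convex even when f is not. Below is a minimal, illustrative NumPy sketch of one such step on the unit sphere, using the extrinsic Fréchet mean task mentioned in the abstract as a toy objective. All names, step sizes, and the substitution of the ambient squared distance for the squared retraction distance are assumptions made for illustration, not the paper's exact algorithm (which also incorporates Nesterov-style acceleration).

import numpy as np

def retract(x, v):
    # Projection retraction on the unit sphere: step in the ambient
    # space, then renormalize back onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def convexified_step(grad_f, x, lam=10.0, inner_steps=100, eta=0.01):
    # Approximately solve  min_y  f(y) + (lam/2) * ||y - x||^2
    # by Riemannian gradient descent. The ambient squared distance
    # stands in for the squared retraction distance (an assumption);
    # for lam >= the weak-convexity modulus of f, the surrogate is convex.
    y = x.copy()
    for _ in range(inner_steps):
        g = grad_f(y) + lam * (y - x)      # Euclidean gradient of surrogate
        g_tan = g - np.dot(g, y) * y       # project onto tangent space at y
        y = retract(y, -eta * g_tan)
    return y

# Toy objective: extrinsic Frechet mean on the sphere,
# f(y) = (1/2n) * sum_i ||y - x_i||^2, with Euclidean gradient y - mean.
rng = np.random.default_rng(0)
pts = rng.normal(size=(20, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
grad_f = lambda y: y - pts.mean(axis=0)

x = pts[0]
for _ in range(10):                        # outer proximal-point loop
    x = convexified_step(grad_f, x)
print("estimated extrinsic Frechet mean:", x)

Running the sketch, the outer proximal-point loop drives x toward the projection of the Euclidean mean onto the sphere, which is the extrinsic Fréchet mean for this objective.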
Persistent Identifier: http://hdl.handle.net/10722/355220
ISSN: 0885-6125
2023 Impact Factor: 4.3
2023 SCImago Journal Rankings: 1.720


DC Field: Value
dc.contributor.author: Lin, Lizhen
dc.contributor.author: Saparbayeva, Bayan
dc.contributor.author: Zhang, Michael Minyi
dc.contributor.author: Dunson, David B.
dc.date.accessioned: 2025-03-29T00:35:24Z
dc.date.available: 2025-03-29T00:35:24Z
dc.date.issued: 2025-02-06
dc.identifier.citation: Machine Learning, 2025, v. 114, n. 3
dc.identifier.issn: 0885-6125
dc.identifier.uri: http://hdl.handle.net/10722/355220
dc.description.abstract: We propose a general scheme for solving convex and non-convex optimization problems on manifolds. The central idea is that, by adding a multiple of the squared retraction distance to the objective function in question, we “convexify” the objective function and solve a series of convex sub-problems in the optimization procedure. Our proposed algorithm adapts to the level of complexity in the objective function without requiring knowledge of the convexity or non-convexity of the objective function. We show that when the objective function is convex, the algorithm provably converges to the optimum and leads to accelerated convergence. When the objective function is non-convex, the algorithm will converge to a stationary point. Our proposed method unifies insights from Nesterov’s original idea for accelerating gradient descent algorithms with recent developments in optimization algorithms in Euclidean space. We demonstrate the utility of our algorithms on several manifold optimization tasks such as estimating intrinsic and extrinsic Fréchet means on spheres and low-rank matrix factorization with Grassmann manifolds applied to the Netflix rating data set.
dc.language: eng
dc.publisher: Springer
dc.relation.ispartof: Machine Learning
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Manifolds
dc.subject: Non-convex optimization
dc.subject: Weakly convex functions
dc.title: Accelerated algorithms for convex and non-convex optimization on manifolds
dc.type: Article
dc.identifier.doi: 10.1007/s10994-024-06649-1
dc.identifier.scopus: eid_2-s2.0-85218207173
dc.identifier.volume: 114
dc.identifier.issue: 3
dc.identifier.eissn: 1573-0565
dc.identifier.issnl: 0885-6125
