Conference Paper: Adaptive primal-dual splitting methods for statistical learning and image processing

Title: Adaptive primal-dual splitting methods for statistical learning and image processing
Authors: Goldstein, Thomas; Li, Min; Yuan, Xiaoming
Issue Date: 2015
Citation: Advances in Neural Information Processing Systems, 2015, v. 2015-January, p. 2089-2097
Abstract: The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently. The Primal-Dual Hybrid Gradient (PDHG) method is a powerful alternative that often has simpler sub-steps than ADMM, thus producing lower complexity solvers. Despite the flexibility of this method, PDHG is often impractical because it requires the careful choice of multiple stepsize parameters. There is often no intuitive way to choose these parameters to maximize efficiency, or even achieve convergence. We propose self-adaptive stepsize rules that automatically tune PDHG parameters for optimal convergence. We rigorously analyze our methods, and identify convergence rates. Numerical experiments show that adaptive PDHG has strong advantages over non-adaptive methods in terms of both efficiency and simplicity for the user.
Persistent Identifier: http://hdl.handle.net/10722/251157
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
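To make the abstract concrete, below is a minimal Python sketch of a residual-balancing adaptive PDHG loop in the spirit of the paper, applied to an assumed toy problem min_x lam*||x||_1 + 0.5*||Ax - b||^2. The residual formulas, the balancing constants delta and eta, the initial adaptivity level alpha, and the problem choice are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_pdhg(A, b, lam, iters=500, delta=1.5, eta=0.95):
    # Hypothetical toy instance: min_x lam*||x||_1 + 0.5*||Ax - b||^2,
    # written as the saddle point min_x max_y <Ax, y> + lam*||x||_1 - g*(y)
    # with g*(y) = 0.5*||y||^2 + <b, y>.
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    x_bar = x.copy()
    L = np.linalg.norm(A, 2)       # spectral norm of A
    tau = sigma = 0.95 / L         # initial stepsizes satisfy tau*sigma*L^2 < 1
    alpha = 0.5                    # adaptivity level; decays so adaptation dies out
    for _ in range(iters):
        # Dual ascent on y (prox of sigma*g*), then primal descent on x
        # (prox of tau*lam*||.||_1), with over-relaxation x_bar = 2*x_new - x.
        y_new = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
        x_new = soft_threshold(x - tau * (A.T @ y_new), tau * lam)
        x_bar = 2.0 * x_new - x
        # Primal and dual residuals of the optimality conditions
        # (illustrative form; the paper's definitions may differ in detail).
        p = np.linalg.norm((x - x_new) / tau - A.T @ (y - y_new))
        d = np.linalg.norm((y - y_new) / sigma - A @ (x - x_new))
        # Residual balancing: enlarge the stepsize whose residual is lagging.
        # tau*sigma stays constant under these updates, so tau*sigma*L^2 < 1
        # continues to hold after every adaptation.
        if p > delta * d:
            tau, sigma, alpha = tau / (1.0 - alpha), sigma * (1.0 - alpha), alpha * eta
        elif d > delta * p:
            tau, sigma, alpha = tau * (1.0 - alpha), sigma / (1.0 - alpha), alpha * eta
        x, y = x_new, y_new
    return x
```

For example, adaptive_pdhg(np.random.randn(20, 50), np.random.randn(20), lam=0.1) produces a sparse approximate solution of a random underdetermined system. Two safeguards in the sketch mirror the kind of conditions under which adaptive PDHG can be shown to converge: the paired updates keep the product tau*sigma fixed, so the stepsize condition imposed at initialization is never violated, and the geometric decay of alpha forces the adaptation to vanish over time.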

 

DC Field: Value

dc.contributor.author: Goldstein, Thomas
dc.contributor.author: Li, Min
dc.contributor.author: Yuan, Xiaoming
dc.date.accessioned: 2018-02-01T01:54:46Z
dc.date.available: 2018-02-01T01:54:46Z
dc.date.issued: 2015
dc.identifier.citation: Advances in Neural Information Processing Systems, 2015, v. 2015-January, p. 2089-2097
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/251157
dc.description.abstract: The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently. The Primal-Dual Hybrid Gradient (PDHG) method is a powerful alternative that often has simpler sub-steps than ADMM, thus producing lower complexity solvers. Despite the flexibility of this method, PDHG is often impractical because it requires the careful choice of multiple stepsize parameters. There is often no intuitive way to choose these parameters to maximize efficiency, or even achieve convergence. We propose self-adaptive stepsize rules that automatically tune PDHG parameters for optimal convergence. We rigorously analyze our methods, and identify convergence rates. Numerical experiments show that adaptive PDHG has strong advantages over non-adaptive methods in terms of both efficiency and simplicity for the user.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Adaptive primal-dual splitting methods for statistical learning and image processing
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-84965146424
dc.identifier.volume: 2015-January
dc.identifier.spage: 2089
dc.identifier.epage: 2097
dc.identifier.issnl: 1049-5258
