File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1093/imanum/drz007
- Scopus: eid_2-s2.0-85074151438
- WOS: WOS:000491253300016
Article: Adaptive restart of accelerated gradient methods under local quadratic growth condition
Title | Adaptive restart of accelerated gradient methods under local quadratic growth condition |
---|---|
Authors | Fercoq, O; Qu, Z |
Keywords | accelerated gradient descent; quadratic growth condition; restarting; unknown error bound |
Issue Date | 2019 |
Publisher | Oxford University Press. The Journal's web site is located at https://academic.oup.com/imajna |
Citation | IMA Journal of Numerical Analysis, 2019, v. 39 n. 4, p. 2069-2095 |
Abstract | By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm; this result was previously known only for sufficiently long restart periods. Since the rate of convergence depends on the match between the restart frequency and the quadratic error bound, we design a scheme that automatically adapts the restart frequency from the observed decrease of the norm of the gradient mapping. Our algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate the efficiency of the algorithm on Lasso, regularized logistic regression and total variation denoising problems. |
Persistent Identifier | http://hdl.handle.net/10722/275730 |
ISSN | 0272-4979 (2023 Impact Factor: 2.3; 2023 SCImago Journal Rankings: 1.861) |
ISI Accession Number ID | WOS:000491253300016 |
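The abstract's first result (restarting an accelerated proximal gradient method at a fixed frequency still gives global linear convergence under quadratic growth) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the fixed restart period, the `fista_restart` helper, and the small Lasso instance below are illustrative assumptions, and the adaptive choice of the restart frequency from the gradient-mapping norm is omitted.

```python
import numpy as np

def fista_restart(grad, prox, x0, step, n_iters, restart_every):
    """FISTA (accelerated proximal gradient) with momentum restarted
    every `restart_every` iterations (fixed-frequency restart sketch)."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for k in range(n_iters):
        x_new = prox(y - step * grad(y), step)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        if (k + 1) % restart_every == 0:
            y, t = x.copy(), 1.0  # restart: drop momentum, reset extrapolation

    return x

# Illustrative Lasso instance: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient
grad = lambda x: A.T @ (A @ x - b)
# Proximal operator of s*lam*||.||_1 is soft-thresholding
prox = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)

x_star = fista_restart(grad, prox, np.zeros(20), 1.0 / L, 2000, restart_every=50)
```

The quantity the paper's adaptive scheme monitors, the norm of the gradient mapping `G(x) = L*(x - prox(x - grad(x)/L, 1/L))`, can be evaluated at `x_star` to check that the restarted iterates have essentially reached a minimizer.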
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Fercoq, O | - |
dc.contributor.author | Qu, Z | - |
dc.date.accessioned | 2019-09-10T02:48:31Z | - |
dc.date.available | 2019-09-10T02:48:31Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | IMA Journal of Numerical Analysis, 2019, v. 39 n. 4, p. 2069-2095 | - |
dc.identifier.issn | 0272-4979 | - |
dc.identifier.uri | http://hdl.handle.net/10722/275730 | - |
dc.description.abstract | By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm; this result was previously known only for sufficiently long restart periods. Since the rate of convergence depends on the match between the restart frequency and the quadratic error bound, we design a scheme that automatically adapts the restart frequency from the observed decrease of the norm of the gradient mapping. Our algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate the efficiency of the algorithm on Lasso, regularized logistic regression and total variation denoising problems. | - |
dc.language | eng | - |
dc.publisher | Oxford University Press. The Journal's web site is located at https://academic.oup.com/imajna | - |
dc.relation.ispartof | IMA Journal of Numerical Analysis | - |
dc.rights | Pre-print: [Journal Title] ©: [year] [owner as specified on the article]. Published by Oxford University Press [on behalf of xxxxxx]. All rights reserved. Pre-print (once an article is published, the preprint notice should be amended to): This is an electronic version of an article published in [include the complete citation information for the final version of the article as published in the print edition of the journal]. Post-print: This is a pre-copy-editing, author-produced PDF of an article accepted for publication in [insert journal title] following peer review. The definitive publisher-authenticated version [insert complete citation information here] is available online at: xxxxxxx [insert URL that the author will receive upon publication here]. | - |
dc.subject | accelerated gradient descent | - |
dc.subject | quadratic growth condition | - |
dc.subject | restarting | - |
dc.subject | unknown error bound | - |
dc.title | Adaptive restart of accelerated gradient methods under local quadratic growth condition | - |
dc.type | Article | - |
dc.identifier.email | Qu, Z: zhengqu@hku.hk | - |
dc.identifier.authority | Qu, Z=rp02096 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1093/imanum/drz007 | - |
dc.identifier.scopus | eid_2-s2.0-85074151438 | - |
dc.identifier.hkuros | 304633 | - |
dc.identifier.volume | 39 | - |
dc.identifier.issue | 4 | - |
dc.identifier.spage | 2069 | - |
dc.identifier.epage | 2095 | - |
dc.identifier.isi | WOS:000491253300016 | - |
dc.publisher.place | United Kingdom | - |
dc.identifier.issnl | 0272-4979 | - |