Article: On the Convergence Analysis of Cubic Regularized Symmetric Rank-1 Quasi-Newton Method and the Incremental Version in the Application of Large-Scale Problems

Title: On the Convergence Analysis of Cubic Regularized Symmetric Rank-1 Quasi-Newton Method and the Incremental Version in the Application of Large-Scale Problems
Authors: Chen, H; Lam, WH; Chan, SC
Keywords: Quasi-Newton method
symmetric rank-1
superlinear convergence rate
cubic regularization
incremental optimization
Issue Date: 2019
Publisher: Institute of Electrical and Electronics Engineers: Open Access Journals. The journal's web site is located at http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6287639
Citation: IEEE Access, 2019, v. 7, p. 114042-114059
Abstract: This paper provides a detailed study of the convergence properties of the cubic regularized symmetric rank-1 (SR1) method (CuREG-SR1) proposed in [2]. The main advantage of incorporating the cubic regularization technique into SR1 is that it alleviates the problem of the resulting matrix in SR1 being indefinite. However, its convergence under the line search framework is less well studied. Here, we first show that CuREG-SR1 converges to a first-order critical point. Moreover, we give a novel result that, provided the uniform linear independence assumption holds, the difference between the approximate Hessian generated by CuREG-SR1 and the true Hessian is bounded. In addition, we show that for a problem of dimension d, CuREG-SR1 generates q-d superlinear steps in every q iterations. We also propose a novel incremental CuREG-SR1 (ICuREG-SR1) algorithm that adapts SR1 and CuREG-SR1 efficiently to large-scale problems. The basic idea is to incorporate an incremental optimization scheme, which progressively updates information for objective functions consisting of a sum of individual functions, as frequently encountered in large-scale machine learning. Numerical experiments on several machine learning problems show that the proposed algorithm offers superior performance in terms of gradient magnitude compared with the other conventional algorithms tested.
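For readers skimming this record, a minimal sketch of the standard ingredients named in the abstract may be useful: the SR1 secant update, the cubic-regularized local model, and the finite-sum objective targeted by incremental methods. The notation below (B_k, s_k, y_k, sigma_k, f_i) is generic and the paper's exact formulation may differ.

\[
B_{k+1} \;=\; B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
\qquad s_k = x_{k+1} - x_k,\quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]
\[
s_k \;\in\; \arg\min_{s \in \mathbb{R}^d}\; \nabla f(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} B_k s + \tfrac{\sigma_k}{3}\,\|s\|^{3},
\qquad
f(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_i(x).
\]

The cubic term keeps the local model bounded below even when B_k is indefinite, which is the difficulty with the plain SR1 update that the abstract refers to, and the finite-sum form of f is the setting in which an incremental scheme updates the contribution of one (or a few) f_i per iteration.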
Persistent Identifier: http://hdl.handle.net/10722/294063
ISSN: 2169-3536
2023 Impact Factor: 3.4
2023 SCImago Journal Rankings: 0.960
ISI Accession Number ID: WOS:000483022100055

 

DC Field: Value
dc.contributor.author: CHEN, H
dc.contributor.author: Lam, WH
dc.contributor.author: Chan, SC
dc.date.accessioned: 2020-11-23T08:25:47Z
dc.date.available: 2020-11-23T08:25:47Z
dc.date.issued: 2019
dc.identifier.citation: IEEE Access, 2019, v. 7, p. 114042-114059
dc.identifier.issn: 2169-3536
dc.identifier.uri: http://hdl.handle.net/10722/294063
dc.description.abstract: This paper provides a detailed study of the convergence properties of the cubic regularized symmetric rank-1 (SR1) method (CuREG-SR1) proposed in [2]. The main advantage of incorporating the cubic regularization technique into SR1 is that it alleviates the problem of the resulting matrix in SR1 being indefinite. However, its convergence under the line search framework is less well studied. Here, we first show that CuREG-SR1 converges to a first-order critical point. Moreover, we give a novel result that, provided the uniform linear independence assumption holds, the difference between the approximate Hessian generated by CuREG-SR1 and the true Hessian is bounded. In addition, we show that for a problem of dimension d, CuREG-SR1 generates q-d superlinear steps in every q iterations. We also propose a novel incremental CuREG-SR1 (ICuREG-SR1) algorithm that adapts SR1 and CuREG-SR1 efficiently to large-scale problems. The basic idea is to incorporate an incremental optimization scheme, which progressively updates information for objective functions consisting of a sum of individual functions, as frequently encountered in large-scale machine learning. Numerical experiments on several machine learning problems show that the proposed algorithm offers superior performance in terms of gradient magnitude compared with the other conventional algorithms tested.
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers: Open Access Journals. The journal's web site is located at http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6287639
dc.relation.ispartof: IEEE Access
dc.rights: IEEE Access. Copyright © Institute of Electrical and Electronics Engineers (IEEE): OAJ.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: Quasi-Newton method
dc.subject: symmetric rank-1
dc.subject: superlinear convergence rate
dc.subject: cubic regularization
dc.subject: incremental optimization
dc.title: On the Convergence Analysis of Cubic Regularized Symmetric Rank-1 Quasi-Newton Method and the Incremental Version in the Application of Large-Scale Problems
dc.type: Article
dc.identifier.email: Lam, WH: whlam@HKUCC-COM.hku.hk
dc.identifier.email: Chan, SC: scchan@eee.hku.hk
dc.identifier.authority: Lam, WH=rp00136
dc.identifier.authority: Chan, SC=rp00094
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.1109/ACCESS.2019.2935900
dc.identifier.scopus: eid_2-s2.0-85097342295
dc.identifier.hkuros: 319254
dc.identifier.volume: 7
dc.identifier.spage: 114042
dc.identifier.epage: 114059
dc.identifier.isi: WOS:000483022100055
dc.publisher.place: United States
dc.identifier.issnl: 2169-3536
