File Download
There are no files associated with this item.
Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1109/TNNLS.2019.2957843
- Scopus: eid_2-s2.0-85086036513
- PMID: 31902778
- WOS: WOS:000587699700029
Article: A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization With Applications
Field | Value |
---|---|
Title | A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization With Applications |
Authors | Chen, H; Wu, HC; Chan, SC; Lam, WH |
Keywords | Damped parameter; limited memory BFGS (LBFGS); nonconjugate exponential models; nonconvex optimization; stochastic quasi-Newton (SQN) method |
Issue Date | 2020 |
Publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=72 |
Citation | IEEE Transactions on Neural Networks and Learning Systems, 2020, v. 31 n. 11, p. 4776-4790 |
Abstract | Ensuring the positive definiteness and avoiding ill conditioning of the Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method are important when solving nonconvex problems. This article proposes a novel stochastic version of a damped and regularized BFGS method for addressing these problems. While the proposed regularization strategy helps to prevent the BFGS matrix from approaching singularity, the new damped parameter further ensures the positivity of the product of the correction pairs. To reduce the computational cost of the stochastic limited memory BFGS (LBFGS) updates and to improve their robustness, the curvature information is updated using the averaged iterate at spaced intervals. The effectiveness of the proposed method is evaluated on logistic regression and Bayesian logistic regression problems in machine learning. Numerical experiments are conducted using both a synthetic data set and several real data sets. The results show that the proposed method generally outperforms the stochastic damped LBFGS (SdLBFGS) method. In particular, for problems with small sample sizes, our method shows superior performance and is capable of mitigating ill-conditioning. Furthermore, our method is more robust than the SdLBFGS method to variations of the batch size and memory size. |
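The abstract's central idea, damping the correction pair so that the curvature product stays positive before it enters the LBFGS update, can be illustrated with a generic Powell-style damping rule combined with the standard two-loop recursion. This is a minimal sketch, not the authors' exact algorithm: the function names, the identity-scaled Hessian approximation, and the damping threshold `delta` are illustrative assumptions.

```python
import numpy as np

def damped_pair(s, y, H_diag=1.0, delta=0.2):
    """Powell-style damping (illustrative, not the paper's exact rule):
    replace y by y_bar = theta*y + (1-theta)*B s so that s^T y_bar > 0.
    B is approximated here by (1/H_diag) * I for simplicity."""
    Bs = s / H_diag
    sBs = s @ Bs
    sy = s @ y
    if sy >= delta * sBs:
        theta = 1.0                              # curvature already safe
    else:
        theta = (1.0 - delta) * sBs / (sBs - sy)  # gives s^T y_bar = delta * s^T B s > 0
    return theta * y + (1.0 - theta) * Bs

def lbfgs_direction(grad, pairs):
    """Standard L-BFGS two-loop recursion over damped (s, y) pairs."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):          # first loop: newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if pairs:
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)            # initial Hessian scaling H0
    for (s, y), a in zip(pairs, reversed(alphas)):  # second loop: oldest first
        rho = 1.0 / (s @ y)
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                             # quasi-Newton descent direction
```

Because every damped pair satisfies `s @ y_bar > 0`, the implicit inverse-Hessian approximation stays positive definite, so the returned direction is a descent direction even when the raw stochastic curvature `s @ y` is negative, which is exactly the failure mode the abstract targets in nonconvex problems.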
Persistent Identifier | http://hdl.handle.net/10722/294070 |
ISSN | 2162-237X (2023 Impact Factor: 10.2; 2023 SCImago Journal Rankings: 4.170) |
ISI Accession Number ID | WOS:000587699700029 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | CHEN, H | - |
dc.contributor.author | Wu, HC | - |
dc.contributor.author | Chan, SC | - |
dc.contributor.author | Lam, WH | - |
dc.date.accessioned | 2020-11-23T08:25:54Z | - |
dc.date.available | 2020-11-23T08:25:54Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2020, v. 31 n. 11, p. 4776-4790 | - |
dc.identifier.issn | 2162-237X | - |
dc.identifier.uri | http://hdl.handle.net/10722/294070 | - |
dc.description.abstract | Ensuring the positive definiteness and avoiding ill conditioning of the Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method are important when solving nonconvex problems. This article proposes a novel stochastic version of a damped and regularized BFGS method for addressing these problems. While the proposed regularization strategy helps to prevent the BFGS matrix from approaching singularity, the new damped parameter further ensures the positivity of the product of the correction pairs. To reduce the computational cost of the stochastic limited memory BFGS (LBFGS) updates and to improve their robustness, the curvature information is updated using the averaged iterate at spaced intervals. The effectiveness of the proposed method is evaluated on logistic regression and Bayesian logistic regression problems in machine learning. Numerical experiments are conducted using both a synthetic data set and several real data sets. The results show that the proposed method generally outperforms the stochastic damped LBFGS (SdLBFGS) method. In particular, for problems with small sample sizes, our method shows superior performance and is capable of mitigating ill-conditioning. Furthermore, our method is more robust than the SdLBFGS method to variations of the batch size and memory size. | -
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=72 | - |
dc.relation.ispartof | IEEE Transactions on Neural Networks and Learning Systems | - |
dc.rights | IEEE Transactions on Neural Networks and Learning Systems. Copyright © Institute of Electrical and Electronics Engineers. | - |
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Damped parameter | - |
dc.subject | limited memory BFGS (LBFGS) | - |
dc.subject | nonconjugate exponential models | - |
dc.subject | nonconvex optimization | - |
dc.subject | stochastic quasi-Newton (SQN) method | - |
dc.title | A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization With Applications | - |
dc.type | Article | - |
dc.identifier.email | Wu, HC: hcwueee@hku.hk | - |
dc.identifier.email | Chan, SC: scchan@eee.hku.hk | - |
dc.identifier.email | Lam, WH: whlam@HKUCC-COM.hku.hk | - |
dc.identifier.authority | Chan, SC=rp00094 | - |
dc.identifier.authority | Lam, WH=rp00136 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TNNLS.2019.2957843 | - |
dc.identifier.pmid | 31902778 | - |
dc.identifier.scopus | eid_2-s2.0-85086036513 | - |
dc.identifier.hkuros | 319280 | - |
dc.identifier.volume | 31 | - |
dc.identifier.issue | 11 | - |
dc.identifier.spage | 4776 | - |
dc.identifier.epage | 4790 | - |
dc.identifier.isi | WOS:000587699700029 | - |
dc.publisher.place | United States | - |