Conference Paper: SDNA: Stochastic Dual Newton Ascent for empirical risk minimization

Title: SDNA: Stochastic Dual Newton Ascent for empirical risk minimization
Authors: Qu, Z; Richtarik, P; Takac, M; Fercoq, O
Issue Date: 2016
Publisher: MIT Press. The journal's web site is located at http://mitpress.mit.edu/jmlr
Citation: The 33rd International Conference on Machine Learning (ICML 2016), New York, NY, 19-24 June 2016. In JMLR: Workshop and Conference Proceedings, 2016, v. 48, p. 1-10
Abstract: We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all local curvature information contained in the examples, which leads to striking improvements in both theory and practice – sometimes by orders of magnitude. In the special case when an L2-regularizer is used in the primal, the dual problem is a concave quadratic maximization problem plus a separable term. In this regime, SDNA in each step solves a proximal subproblem involving a random principal submatrix of the Hessian of the quadratic function; whence the name of the method.
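
To make the abstract's description concrete, the sketch below illustrates the kind of update it describes, for the smooth special case where the dual is a concave quadratic D(alpha) = b^T alpha - 0.5 * alpha^T M alpha with no separable term. This is a minimal illustration under those assumptions, not the paper's implementation; the function name, the subset size tau, and the toy data are all invented for the example.

    import numpy as np

    def sdna_quadratic_sketch(M, b, tau=8, iters=500, seed=0):
        """Sketch of an SDNA-style update for maximizing the smooth
        concave quadratic D(alpha) = b^T alpha - 0.5 * alpha^T M alpha,
        with M symmetric positive definite (separable term omitted).

        Each iteration samples a random subset S of dual variables and
        takes an exact Newton step on that subset, which means solving
        a |S| x |S| system with the random principal submatrix M[S, S]
        of the Hessian; this is the step the method is named after.
        """
        rng = np.random.default_rng(seed)
        n = b.shape[0]
        alpha = np.zeros(n)
        for _ in range(iters):
            # random subset of dual variables to update this iteration
            S = rng.choice(n, size=tau, replace=False)
            # gradient of D restricted to the sampled coordinates
            grad_S = b[S] - (M @ alpha)[S]
            # Newton step on the random principal submatrix of the Hessian
            alpha[S] += np.linalg.solve(M[np.ix_(S, S)], grad_S)
        return alpha

    # Toy usage: a ridge-style positive definite quadratic, checked
    # against the exact maximizer alpha* = M^{-1} b.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    M = A.T @ A / 200 + 0.1 * np.eye(50)
    b = rng.standard_normal(50)
    alpha = sdna_quadratic_sketch(M, b)
    print(np.linalg.norm(alpha - np.linalg.solve(M, b)))  # should be small

By contrast, a stochastic dual coordinate ascent step on the same subset would use only the diagonal entries M[i, i]; solving against the full principal submatrix M[S, S] is what lets SDNA exploit all local curvature within the sampled block.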
Description: This journal volume is entitled: Proceedings of the 33rd International Conference on Machine Learning, ICML 2016
The full version of this paper can be found at https://arxiv.org/abs/1502.02268
Persistent Identifier: http://hdl.handle.net/10722/235018
ISSN: 1532-4435
2021 Impact Factor: 5.177
2020 SCImago Journal Rankings: 1.240

 

DC Field | Value | Language
dc.contributor.author | Qu, Z | -
dc.contributor.author | Richtarik, P | -
dc.contributor.author | Takac, M | -
dc.contributor.author | Fercoq, O | -
dc.date.accessioned | 2016-10-14T13:50:45Z | -
dc.date.available | 2016-10-14T13:50:45Z | -
dc.date.issued | 2016 | -
dc.identifier.citation | The 33rd International Conference on Machine Learning (ICML 2016), New York, NY, 19-24 June 2016. In JMLR: Workshop and Conference Proceedings, 2016, v. 48, p. 1-10 | -
dc.identifier.issn | 1532-4435 | -
dc.identifier.uri | http://hdl.handle.net/10722/235018 | -
dc.description | This journal volume is entitled: Proceedings of the 33rd International Conference on Machine Learning, ICML 2016 | -
dc.description | The full version of this paper can be found at https://arxiv.org/abs/1502.02268 | -
dc.description.abstract | We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all local curvature information contained in the examples, which leads to striking improvements in both theory and practice – sometimes by orders of magnitude. In the special case when an L2-regularizer is used in the primal, the dual problem is a concave quadratic maximization problem plus a separable term. In this regime, SDNA in each step solves a proximal subproblem involving a random principal submatrix of the Hessian of the quadratic function; whence the name of the method. | -
dc.language | eng | -
dc.publisher | MIT Press. The journal's web site is located at http://mitpress.mit.edu/jmlr | -
dc.relation.ispartof | Journal of Machine Learning Research | -
dc.rights | Journal of Machine Learning Research. Copyright © MIT Press. | -
dc.rights | Author holds the copyright | -
dc.title | SDNA: Stochastic Dual Newton Ascent for empirical risk minimization | -
dc.type | Conference_Paper | -
dc.identifier.email | Qu, Z: zhengqu@hku.hk | -
dc.identifier.authority | Qu, Z=rp02096 | -
dc.description.nature | published_or_final_version | -
dc.identifier.hkuros | 269840 | -
dc.identifier.volume | 48 | -
dc.identifier.spage | 1 | -
dc.identifier.epage | 10 | -
dc.publisher.place | United States | -
dc.customcontrol.immutable | sml 161017 - embargo till 170601 | -
dc.identifier.issnl | 1532-4435 | -
