Article: L-SVRG and L-Katyusha with Arbitrary Sampling

Title: L-SVRG and L-Katyusha with Arbitrary Sampling
Authors: Qian, X; Qu, Z; Richtarik, P
Keywords: L-SVRG; L-Katyusha; Arbitrary sampling; Expected smoothness; ESO
Issue Date: 2021
Publisher: MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr
Citation: Journal of Machine Learning Research, 2021, v. 22, p. 1-49
Abstract: We develop and analyze a new family of nonaccelerated and accelerated loopless variance-reduced methods for finite-sum optimization problems. Our convergence analysis relies on a novel expected smoothness condition which upper bounds the variance of the stochastic gradient estimation by a constant times a distance-like function. This allows us to handle arbitrary sampling schemes as well as the nonconvex case with ease. We perform an in-depth estimation of these expected smoothness parameters and propose new importance samplings which allow linear speedup when the expected minibatch size is in a certain range. Furthermore, a connection between these expected smoothness parameters and expected separable overapproximation (ESO) is established, which allows us to exploit data sparsity as well. Our general methods and results recover as special cases the loop…
Description: Open Access Journal
Persistent Identifier: http://hdl.handle.net/10722/307673
ISSN: 1532-4435
2021 Impact Factor: 5.177
2020 SCImago Journal Rankings: 1.240
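
The abstract above is the only technical content in this record; to illustrate the loopless structure it refers to, the following is a minimal, unofficial sketch of the L-SVRG update on a ridge-regression objective. The function names, the ridge objective, and the uniform single-index sampling are assumptions made for illustration only; the paper's estimator and analysis cover arbitrary (minibatch and importance) sampling schemes and also include the accelerated L-Katyusha variant, which this sketch does not implement.

```python
# Minimal, unofficial L-SVRG sketch (assumed ridge-regression objective):
#   f(x) = (1/n) * sum_i [ 0.5*(a_i^T x - b_i)^2 + (lam/2)*||x||^2 ]
# Only the loopless structure is shown: the reference point w is refreshed
# with a small probability p instead of in an outer loop.
import numpy as np

def full_grad(A, b, lam, z):
    # gradient of f at z
    n = A.shape[0]
    return A.T @ (A @ z - b) / n + lam * z

def lsvrg(A, b, lam=0.1, eta=0.01, p=None, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    p = 1.0 / n if p is None else p      # probability of refreshing the reference point
    x = np.zeros(d)                      # current iterate
    w = x.copy()                         # reference point
    gw = full_grad(A, b, lam, w)         # full gradient at the reference point

    def grad_i(i, z):
        # gradient of the i-th component f_i at z
        return (A[i] @ z - b[i]) * A[i] + lam * z

    for _ in range(iters):
        i = rng.integers(n)                    # uniform single-index sampling (simplification)
        g = grad_i(i, x) - grad_i(i, w) + gw   # variance-reduced gradient estimator
        x = x - eta * g                        # gradient step
        if rng.random() < p:                   # loopless reference-point update
            w, gw = x.copy(), full_grad(A, b, lam, x)
    return x
```

As a rough sanity check, lsvrg(np.random.randn(200, 10), np.random.randn(200)) should move toward the corresponding ridge-regression minimizer; refreshing w by coin flip rather than in an outer loop is what makes the method "loopless" and simplifies its analysis.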

 

DC Field: Value
dc.contributor.author: Qian, X
dc.contributor.author: Qu, Z
dc.contributor.author: Richtarik, P
dc.date.accessioned: 2021-11-12T13:36:08Z
dc.date.available: 2021-11-12T13:36:08Z
dc.date.issued: 2021
dc.identifier.citation: Journal of Machine Learning Research, 2021, v. 22, p. 1-49
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/307673
dc.description: Open Access Journal
dc.description.abstract: We develop and analyze a new family of nonaccelerated and accelerated loopless variance-reduced methods for finite-sum optimization problems. Our convergence analysis relies on a novel expected smoothness condition which upper bounds the variance of the stochastic gradient estimation by a constant times a distance-like function. This allows us to handle arbitrary sampling schemes as well as the nonconvex case with ease. We perform an in-depth estimation of these expected smoothness parameters and propose new importance samplings which allow linear speedup when the expected minibatch size is in a certain range. Furthermore, a connection between these expected smoothness parameters and expected separable overapproximation (ESO) is established, which allows us to exploit data sparsity as well. Our general methods and results recover as special cases the loop…
dc.language: eng
dc.publisher: MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr
dc.relation.ispartof: Journal of Machine Learning Research
dc.rights: Journal of Machine Learning Research. Copyright © MIT Press.
dc.subject: L-SVRG
dc.subject: L-Katyusha
dc.subject: Arbitrary sampling
dc.subject: Expected smoothness
dc.subject: ESO
dc.title: L-SVRG and L-Katyusha with Arbitrary Sampling
dc.type: Article
dc.identifier.email: Qu, Z: zhengqu@hku.hk
dc.identifier.authority: Qu, Z=rp02096
dc.description.nature: link_to_OA_fulltext
dc.identifier.hkuros: 329924
dc.identifier.volume: 22
dc.identifier.spage: 1
dc.identifier.epage: 49
dc.publisher.place: United States
