Article: L-SVRG and L-Katyusha with Arbitrary Sampling
Title | L-SVRG and L-Katyusha with Arbitrary Sampling |
---|---|
Authors | Qian, X; Qu, Z; Richtarik, P |
Keywords | L-SVRG; L-Katyusha; Arbitrary sampling; Expected smoothness; ESO |
Issue Date | 2021 |
Publisher | MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr |
Citation | Journal of Machine Learning Research, 2021, v. 22, p. 1-49 |
Abstract | We develop and analyze a new family of nonaccelerated and accelerated loopless variance-reduced methods for finite-sum optimization problems. Our convergence analysis relies on a novel expected smoothness condition which upper bounds the variance of the stochastic gradient estimator by a constant times a distance-like function. This allows us to handle with ease arbitrary sampling schemes as well as the nonconvex case. We perform an in-depth estimation of these expected smoothness parameters and propose new importance samplings which allow linear speedup when the expected minibatch size is in a certain range. Furthermore, a connection between these expected smoothness parameters and expected separable overapproximation (ESO) is established, which allows us to exploit data sparsity as well. Our general methods and results recover as special cases the loopless SVRG and loopless Katyusha methods. |
Description | Open Access Journal |
Persistent Identifier | http://hdl.handle.net/10722/307673 |
ISSN | 1532-4435 (2021 Impact Factor: 5.177; 2020 SCImago Journal Rankings: 1.240) |
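
The "expected smoothness" condition mentioned in the abstract bounds the variance of the gradient estimator by a constant times a distance-like quantity. A schematic form of such a condition (notation illustrative, not the paper's exact statement) is:

$$
\mathbb{E}\big[\|g(x) - \nabla f(x)\|^2\big] \;\le\; 2\mathcal{L}\, D_f(x, w), \qquad
D_f(x, w) = f(x) - f(w) - \langle \nabla f(w),\, x - w \rangle,
$$

where $g(x)$ is the stochastic gradient estimator built from the current point $x$ and a reference point $w$, $D_f$ is the Bregman divergence of $f$ (the "distance-like function"), and the expected smoothness constant $\mathcal{L}$ depends on the sampling scheme.

The "loopless" construction replaces SVRG's outer loop with a coin flip that occasionally refreshes the reference point. Below is a minimal sketch of L-SVRG with uniform single-element sampling, the simplest special case of the paper's arbitrary-sampling framework; the helper names `grad_i` and `full_grad` and all parameter defaults are illustrative, not from the paper.

```python
import numpy as np

def l_svrg(grad_i, full_grad, x0, n, step_size=0.1, p=None, n_iters=1000, rng=None):
    """Sketch of loopless SVRG (L-SVRG) with uniform sampling.

    grad_i(i, x)  -- gradient of the i-th summand f_i at x
    full_grad(x)  -- gradient of f(x) = (1/n) * sum_i f_i(x)
    p             -- probability of refreshing the reference point (often ~ 1/n)
    """
    rng = np.random.default_rng() if rng is None else rng
    p = 1.0 / n if p is None else p
    x = x0.copy()
    w = x0.copy()              # reference point
    gw = full_grad(w)          # full gradient at the reference point
    for _ in range(n_iters):
        i = rng.integers(n)    # uniform sampling; the paper allows arbitrary samplings
        # unbiased, variance-reduced gradient estimator
        g = grad_i(i, x) - grad_i(i, w) + gw
        x = x - step_size * g
        if rng.random() < p:   # coin flip replaces SVRG's outer loop
            w = x.copy()
            gw = full_grad(w)
    return x
```

Under minibatch or importance samplings (the "arbitrary sampling" of the title), only the distribution of the sampled indices and the weighting inside the estimator change; the coin-flip structure of the method stays the same.
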
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Qian, X | - |
dc.contributor.author | Qu, Z | - |
dc.contributor.author | Richtarik, P | - |
dc.date.accessioned | 2021-11-12T13:36:08Z | - |
dc.date.available | 2021-11-12T13:36:08Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2021, v. 22, p. 1-49 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/307673 | - |
dc.description | Open Access Journal | - |
dc.description.abstract | We develop and analyze a new family of nonaccelerated and accelerated loopless variance-reduced methods for finite-sum optimization problems. Our convergence analysis relies on a novel expected smoothness condition which upper bounds the variance of the stochastic gradient estimator by a constant times a distance-like function. This allows us to handle with ease arbitrary sampling schemes as well as the nonconvex case. We perform an in-depth estimation of these expected smoothness parameters and propose new importance samplings which allow linear speedup when the expected minibatch size is in a certain range. Furthermore, a connection between these expected smoothness parameters and expected separable overapproximation (ESO) is established, which allows us to exploit data sparsity as well. Our general methods and results recover as special cases the loopless SVRG and loopless Katyusha methods. | -
dc.language | eng | - |
dc.publisher | MIT Press. The Journal's web site is located at http://mitpress.mit.edu/jmlr | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.rights | Journal of Machine Learning Research. Copyright © MIT Press. | - |
dc.subject | L-SVRG | - |
dc.subject | L-Katyusha | - |
dc.subject | Arbitrary sampling | - |
dc.subject | Expected smoothness | - |
dc.subject | ESO | - |
dc.title | L-SVRG and L-Katyusha with Arbitrary Sampling | - |
dc.type | Article | - |
dc.identifier.email | Qu, Z: zhengqu@hku.hk | - |
dc.identifier.authority | Qu, Z=rp02096 | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.hkuros | 329924 | - |
dc.identifier.volume | 22 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 49 | - |
dc.publisher.place | United States | - |