Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1109/TAC.2019.2953209
- Scopus: eid_2-s2.0-85086049259
- Web of Science: WOS:000542947800004
Article: SI-ADMM: A Stochastic Inexact ADMM Framework for Stochastic Convex Programs
Title | SI-ADMM: A Stochastic Inexact ADMM Framework for Stochastic Convex Programs |
---|---|
Authors | Xie, Yue; Shanbhag, Uday V. |
Keywords | Alternating direction method of multiplier (ADMM); Convex optimization; Stochastic approximation; Stochastic optimization |
Issue Date | 2020 |
Citation | IEEE Transactions on Automatic Control, 2020, v. 65, n. 6, p. 2355-2370 |
Abstract | We consider the structured stochastic convex program requiring the minimization of $\mathbb{E}_\xi[\tilde{f}(x,\xi)] + \mathbb{E}_\xi[\tilde{g}(y,\xi)]$ subject to the constraint $Ax + By = b$. Motivated by the need for decentralized schemes, we propose a stochastic inexact alternating direction method of multiplier (SI-ADMM) framework in which subproblems are solved inexactly via stochastic approximation schemes. Based on this framework, we prove the following: 1) under suitable assumptions on the batch size of samples utilized at each iteration, the SI-ADMM scheme produces a sequence that converges to the unique solution almost surely; 2) if the number of gradient steps (or equivalently, the number of sampled gradients) utilized for solving the subproblems in each iteration increases at a geometric rate, the mean-squared error diminishes to zero at a prescribed geometric rate; and 3) the overall iteration complexity in terms of gradient steps (or equivalently, samples) is consistent with the canonical level of $\mathcal{O}(1/\epsilon)$. Preliminary applications to LASSO and distributed regression suggest that the scheme performs well compared with its competitors. |
Persistent Identifier | http://hdl.handle.net/10722/309261 |
ISSN | 0018-9286 (2023 Impact Factor: 6.2; 2023 SCImago Journal Rank: 4.501) |
ISI Accession Number ID | WOS:000542947800004 |
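The abstract describes an ADMM scheme whose subproblems are solved inexactly by stochastic approximation, with the per-iteration gradient budget growing geometrically. Below is a minimal, hedged sketch of that idea on a toy consensus problem; the problem data, step sizes, and budget schedule are illustrative choices, not the paper's exact algorithm or parameters.

```python
import numpy as np

# Toy instance of min E[f(x,xi)] + E[g(y,xi)] s.t. Ax + By = b:
#   f(x) = E[0.5||x - xi1||^2], g(y) = E[0.5||y - xi2||^2], A = I, B = -I, b = 0,
# so the unique solution is x = y = (E[xi1] + E[xi2]) / 2.
rng = np.random.default_rng(0)
n = 3
mu1, mu2 = np.ones(n), 3.0 * np.ones(n)   # E[xi1], E[xi2]; solution is 2.0
rho = 1.0                                  # augmented Lagrangian penalty
x, y, lam = np.zeros(n), np.zeros(n), np.zeros(n)

def sa_solve(sampled_grad, z0, num_steps):
    """Inexact subproblem solve via stochastic approximation (SGD, 1/k steps)."""
    z = z0.copy()
    for k in range(1, num_steps + 1):
        z = z - (1.0 / k) * sampled_grad(z)
    return z

for t in range(20):
    budget = int(10 * 1.5 ** t)           # geometrically increasing gradient steps
    # x-update: approximately min_x f(x) + lam^T(x - y) + (rho/2)||x - y||^2
    x = sa_solve(lambda z: (z - (mu1 + 0.1 * rng.standard_normal(n)))
                           + lam + rho * (z - y), x, budget)
    # y-update: approximately min_y g(y) + lam^T(x - y) + (rho/2)||x - y||^2
    y = sa_solve(lambda z: (z - (mu2 + 0.1 * rng.standard_normal(n)))
                           - lam + rho * (z - x), y, budget)
    lam = lam + rho * (x - y)             # dual ascent on the constraint x - y = 0

print(np.round(x, 2), np.round(y, 2))     # both near the solution 2.0
```

The geometric growth of `budget` mirrors result 2) of the abstract: as subproblems are solved more accurately over the iterations, the outer ADMM error can decay at a geometric rate.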
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xie, Yue | - |
dc.contributor.author | Shanbhag, Uday V. | - |
dc.date.accessioned | 2021-12-15T03:59:51Z | - |
dc.date.available | 2021-12-15T03:59:51Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | IEEE Transactions on Automatic Control, 2020, v. 65, n. 6, p. 2355-2370 | - |
dc.identifier.issn | 0018-9286 | - |
dc.identifier.uri | http://hdl.handle.net/10722/309261 | - |
dc.description.abstract | We consider the structured stochastic convex program requiring the minimization of $\mathbb{E}_\xi[\tilde{f}(x,\xi)] + \mathbb{E}_\xi[\tilde{g}(y,\xi)]$ subject to the constraint $Ax + By = b$. Motivated by the need for decentralized schemes, we propose a stochastic inexact alternating direction method of multiplier (SI-ADMM) framework in which subproblems are solved inexactly via stochastic approximation schemes. Based on this framework, we prove the following: 1) under suitable assumptions on the batch size of samples utilized at each iteration, the SI-ADMM scheme produces a sequence that converges to the unique solution almost surely; 2) if the number of gradient steps (or equivalently, the number of sampled gradients) utilized for solving the subproblems in each iteration increases at a geometric rate, the mean-squared error diminishes to zero at a prescribed geometric rate; and 3) the overall iteration complexity in terms of gradient steps (or equivalently, samples) is consistent with the canonical level of $\mathcal{O}(1/\epsilon)$. Preliminary applications to LASSO and distributed regression suggest that the scheme performs well compared with its competitors. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Automatic Control | - |
dc.subject | Alternating direction method of multiplier (ADMM) | - |
dc.subject | Convex optimization | - |
dc.subject | Stochastic approximation | - |
dc.subject | Stochastic optimization | - |
dc.title | SI-ADMM: A Stochastic Inexact ADMM Framework for Stochastic Convex Programs | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TAC.2019.2953209 | - |
dc.identifier.scopus | eid_2-s2.0-85086049259 | - |
dc.identifier.volume | 65 | - |
dc.identifier.issue | 6 | - |
dc.identifier.spage | 2355 | - |
dc.identifier.epage | 2370 | - |
dc.identifier.eissn | 1558-2523 | - |
dc.identifier.isi | WOS:000542947800004 | - |