Article: Online sufficient dimension reduction through sliced inverse regression
Field | Value
---|---
Title | Online sufficient dimension reduction through sliced inverse regression
Authors | Cai, Zhanrui; Li, Runze; Zhu, Liping
Keywords | Dimension reduction; Gradient descent; Online learning; Perturbation; Singular value decomposition; Sliced inverse regression
Issue Date | 2020
Citation | Journal of Machine Learning Research, 2020, v. 21
Abstract | Sliced inverse regression is an effective paradigm that achieves the goal of dimension reduction by replacing high dimensional covariates with a small number of linear combinations. It does not impose parametric assumptions on the dependence structure. More importantly, such a reduction of dimension is sufficient in that it causes no loss of information. In this paper, we adapt stationary sliced inverse regression to cope with rapidly changing environments by implementing sliced inverse regression in an online fashion. This online learner consists of two steps. In the first step we construct an online estimate of the kernel matrix; in the second step we propose two online algorithms to perform online singular value decomposition, one motivated by the perturbation method and the other by gradient descent optimization. The theoretical properties of this online learner are established. We demonstrate its numerical performance through simulations and real-world applications. All numerical studies confirm that the online learner performs as well as the batch learner. (See the code sketch following this table.)
Persistent Identifier | http://hdl.handle.net/10722/328782
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796)
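
The abstract describes a two-step online learner: a streaming estimate of the sliced inverse regression kernel matrix, followed by an online singular value decomposition of that estimate. Below is a minimal sketch of this idea, not the authors' implementation: it assumes standardized covariates, fixed slice boundaries for the response, and uses an Oja-style gradient step with re-orthonormalization as a stand-in for the paper's gradient-descent online SVD (the perturbation-based alternative is not shown). The class name `OnlineSIR` and its parameters are illustrative only.

```python
# Minimal sketch of a two-step online SIR learner (illustrative, not the paper's code).
# Assumptions: covariates are standardized, slice boundaries for y are fixed in advance.
import numpy as np

class OnlineSIR:
    def __init__(self, p, d, n_slices=5, y_grid=None, step_size=0.01, seed=0):
        self.p, self.d, self.H = p, d, n_slices
        # Fixed slice boundaries for y; assumed known or set from a pilot sample.
        self.y_grid = y_grid if y_grid is not None else np.linspace(-1.0, 1.0, n_slices - 1)
        self.n = 0
        self.mean_x = np.zeros(p)
        self.slice_n = np.zeros(n_slices)
        self.slice_mean = np.zeros((n_slices, p))
        # Random orthonormal initialization of the top-d directions.
        rng = np.random.default_rng(seed)
        self.B, _ = np.linalg.qr(rng.standard_normal((p, d)))
        self.lr = step_size

    def _kernel_matrix(self):
        # Running estimate of M = Cov(E[X | Y]) built from slice means.
        M = np.zeros((self.p, self.p))
        for h in range(self.H):
            if self.slice_n[h] == 0:
                continue
            w = self.slice_n[h] / self.n
            diff = self.slice_mean[h] - self.mean_x
            M += w * np.outer(diff, diff)
        return M

    def partial_fit(self, x, y):
        # Step 1: update running moments (overall mean and per-slice means).
        self.n += 1
        self.mean_x += (x - self.mean_x) / self.n
        h = int(np.searchsorted(self.y_grid, y))
        self.slice_n[h] += 1
        self.slice_mean[h] += (x - self.slice_mean[h]) / self.slice_n[h]
        # Step 2: one Oja-style gradient step toward the top-d eigenspace of M,
        # followed by re-orthonormalization via QR.
        M = self._kernel_matrix()
        self.B += self.lr * M @ self.B
        self.B, _ = np.linalg.qr(self.B)
        return self.B  # current estimate of the sufficient directions


# Toy usage: y depends on x only through one linear combination (hypothetical data).
rng = np.random.default_rng(1)
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
model = OnlineSIR(p=5, d=1)
for _ in range(5000):
    x = rng.standard_normal(5)
    y = x @ beta + 0.1 * rng.standard_normal()
    B = model.partial_fit(x, y)
print("estimated direction (up to sign):", B.ravel().round(2))
```

In this sketch the kernel matrix is recomputed from the running slice means at every step for clarity; a more faithful streaming implementation would update the decomposition directly from the small change each new observation induces, which is the kind of update the perturbation-based algorithm mentioned in the abstract targets.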
DC Field | Value | Language |
---|---|---
dc.contributor.author | Cai, Zhanrui | - |
dc.contributor.author | Li, Runze | - |
dc.contributor.author | Zhu, Liping | - |
dc.date.accessioned | 2023-07-22T06:23:58Z | - |
dc.date.available | 2023-07-22T06:23:58Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2020, v. 21 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/328782 | - |
dc.description.abstract | Sliced inverse regression is an effective paradigm that achieves the goal of dimension reduction by replacing high dimensional covariates with a small number of linear combinations. It does not impose parametric assumptions on the dependence structure. More importantly, such a reduction of dimension is sufficient in that it causes no loss of information. In this paper, we adapt stationary sliced inverse regression to cope with rapidly changing environments by implementing sliced inverse regression in an online fashion. This online learner consists of two steps. In the first step we construct an online estimate of the kernel matrix; in the second step we propose two online algorithms to perform online singular value decomposition, one motivated by the perturbation method and the other by gradient descent optimization. The theoretical properties of this online learner are established. We demonstrate its numerical performance through simulations and real-world applications. All numerical studies confirm that the online learner performs as well as the batch learner. | -
dc.language | eng | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.subject | Dimension reduction | - |
dc.subject | Gradient descent | - |
dc.subject | Online learning | - |
dc.subject | Perturbation | - |
dc.subject | Singular value decomposition | - |
dc.subject | Sliced inverse regression | - |
dc.title | Online sufficient dimension reduction through sliced inverse regression | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85086798987 | - |
dc.identifier.volume | 21 | - |
dc.identifier.eissn | 1533-7928 | - |