Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1137/15M1035653
- Scopus: eid_2-s2.0-84976892498
- Web of Science: WOS:000385282800032
Article: Incremental regularized least squares for dimensionality reduction of large-scale data
Title | Incremental regularized least squares for dimensionality reduction of large-scale data |
---|---|
Authors | Zhang, Xiaowei; Cheng, Li; Chu, Delin; Liao, Li Zhi; Ng, Michael K.; Tan, Roger C.E. |
Keywords | LSQR; Incremental regularized least squares; Linear discriminant analysis; Supervised dimensionality reduction |
Issue Date | 2016 |
Citation | SIAM Journal on Scientific Computing, 2016, v. 38, n. 3, p. B414-B439 |
Abstract | © 2016 Society for Industrial and Applied Mathematics. Over the past few decades, much attention has been drawn to large-scale incremental data analysis, where researchers are faced with huge amounts of high-dimensional data acquired incrementally. In such a case, conventional algorithms that recompute the result from scratch whenever a new sample arrives are highly inefficient. To overcome this problem, we propose a new incremental algorithm, incremental regularized least squares (IRLS), that incrementally computes the solution to the regularized least squares (RLS) problem with multiple columns on the right-hand side. More specifically, for an RLS problem with c (c > 1) columns on the right-hand side, we update its unique solution by solving an RLS problem with a single column on the right-hand side whenever a new sample arrives, instead of solving an RLS problem with c columns on the right-hand side from scratch. As a direct application of IRLS, we consider the supervised dimensionality reduction of large-scale data and focus on linear discriminant analysis (LDA). We first propose a new batch LDA model that is closely related to the RLS problem, and then apply IRLS to develop a new incremental LDA algorithm. Experimental results on real-world datasets demonstrate the effectiveness and efficiency of our algorithms. (A schematic sketch of the incremental update idea follows this table.) |
Persistent Identifier | http://hdl.handle.net/10722/277034 |
ISSN | 1064-8275 (2023 Impact Factor: 3.0; 2023 SCImago Journal Rankings: 1.803) |
ISI Accession Number ID | WOS:000385282800032 |
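The abstract outlines the core idea behind IRLS: rather than re-solving the multi-column regularized least squares (RLS) problem from scratch each time a sample arrives, the existing solution is updated incrementally. The snippet below is a minimal, hypothetical sketch of that general idea, implemented by maintaining and rank-one-updating the regularized normal equations in NumPy. It is not the paper's IRLS algorithm, which instead reduces each update to a single-column RLS solve (the keywords suggest LSQR is involved); the class and variable names here are illustrative assumptions only.

```python
import numpy as np

# Hypothetical sketch (not the paper's method): maintain the normal equations of
#   min_W ||A W - B||_F^2 + lam * ||W||_F^2
# and apply a rank-one correction when a new sample (a, b) arrives, so the
# solution can be refreshed without refitting on the full data matrix.

class IncrementalRLS:
    def __init__(self, d, c, lam=1.0):
        self.G = lam * np.eye(d)   # accumulates A^T A + lam * I
        self.H = np.zeros((d, c))  # accumulates A^T B

    def add_sample(self, a, b):
        """a: new feature vector of length d; b: its c-dimensional target row."""
        self.G += np.outer(a, a)   # rank-one update of the regularized Gram matrix
        self.H += np.outer(a, b)   # update the right-hand side
        return np.linalg.solve(self.G, self.H)  # refreshed solution W (d x c)

# Usage: stream samples and keep the RLS solution up to date.
d, c = 5, 3
rng = np.random.default_rng(0)
model = IncrementalRLS(d, c, lam=0.1)
for _ in range(10):
    W = model.add_sample(rng.normal(size=d), rng.normal(size=c))
print(W.shape)  # (5, 3)
```

In this sketch each update costs O(d^2) plus one d-by-d solve, which already avoids refitting from scratch; per the abstract, the paper's IRLS goes further by replacing the c-column RLS solve with a single-column one at each new sample.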
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhang, Xiaowei | - |
dc.contributor.author | Cheng, Li | - |
dc.contributor.author | Chu, Delin | - |
dc.contributor.author | Liao, Li Zhi | - |
dc.contributor.author | Ng, Michael K. | - |
dc.contributor.author | Tan, Roger C.E. | - |
dc.date.accessioned | 2019-09-18T08:35:24Z | - |
dc.date.available | 2019-09-18T08:35:24Z | - |
dc.date.issued | 2016 | - |
dc.identifier.citation | SIAM Journal on Scientific Computing, 2016, v. 38, n. 3, p. B414-B439 | - |
dc.identifier.issn | 1064-8275 | - |
dc.identifier.uri | http://hdl.handle.net/10722/277034 | - |
dc.description.abstract | © 2016 Society for Industrial and Applied Mathematics. Over the past few decades, much attention has been drawn to large-scale incremental data analysis, where researchers are faced with huge amounts of high-dimensional data acquired incrementally. In such a case, conventional algorithms that recompute the result from scratch whenever a new sample arrives are highly inefficient. To overcome this problem, we propose a new incremental algorithm, incremental regularized least squares (IRLS), that incrementally computes the solution to the regularized least squares (RLS) problem with multiple columns on the right-hand side. More specifically, for an RLS problem with c (c > 1) columns on the right-hand side, we update its unique solution by solving an RLS problem with a single column on the right-hand side whenever a new sample arrives, instead of solving an RLS problem with c columns on the right-hand side from scratch. As a direct application of IRLS, we consider the supervised dimensionality reduction of large-scale data and focus on linear discriminant analysis (LDA). We first propose a new batch LDA model that is closely related to the RLS problem, and then apply IRLS to develop a new incremental LDA algorithm. Experimental results on real-world datasets demonstrate the effectiveness and efficiency of our algorithms. | -
dc.language | eng | - |
dc.relation.ispartof | SIAM Journal on Scientific Computing | - |
dc.subject | LSQR | -
dc.subject | Incremental regularized least squares | - |
dc.subject | Linear discriminant analysis | - |
dc.subject | Supervised dimensionality reduction | - |
dc.title | Incremental regularized least squares for dimensionality reduction of large-scale data | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1137/15M1035653 | - |
dc.identifier.scopus | eid_2-s2.0-84976892498 | - |
dc.identifier.volume | 38 | - |
dc.identifier.issue | 3 | - |
dc.identifier.spage | B414 | - |
dc.identifier.epage | B439 | - |
dc.identifier.eissn | 1095-7200 | - |
dc.identifier.isi | WOS:000385282800032 | - |