Article: Robust Kernel Low-Rank Representation
| Field | Value |
|---|---|
| Title | Robust Kernel Low-Rank Representation |
| Authors | Xiao, Shijie; Tan, Mingkui; Xu, Dong; Dong, Zhao Yang |
| Keywords | kernel methods; low-rank representation (LRR) |
| Issue Date | 2016 |
| Citation | IEEE Transactions on Neural Networks and Learning Systems, 2016, v. 27, n. 11, p. 2268-2281 |
| Abstract | Recently, low-rank representation (LRR) has shown promising performance in many real-world applications such as face clustering. However, LRR may not achieve satisfactory results when dealing with the data from nonlinear subspaces, since it is originally designed to handle the data from linear subspaces in the input space. Meanwhile, the kernel-based methods deal with the nonlinear data by mapping it from the original input space to a new feature space through a kernel-induced mapping. To effectively cope with the nonlinear data, we first propose the kernelized version of LRR in the clean data case. We also present a closed-form solution for the resultant optimization problem. Moreover, to handle corrupted data, we propose the robust kernel LRR (RKLRR) approach, and develop an efficient optimization algorithm to solve it based on the alternating direction method. In particular, we show that both the subproblems in our optimization algorithm can be efficiently and exactly solved, and it is guaranteed to obtain a globally optimal solution. Besides, our proposed algorithm can also solve the original LRR problem, which is a special case of our RKLRR when using the linear kernel. In addition, based on our new optimization technique, the kernelization of some variants of LRR can be similarly achieved. Comprehensive experiments on synthetic data sets and real-world data sets clearly demonstrate the efficiency of our algorithm, as well as the effectiveness of RKLRR and the kernelization of two variants of LRR. |
| Persistent Identifier | http://hdl.handle.net/10722/321732 |
| ISSN | 2162-237X (2023 Impact Factor: 10.2; 2023 SCImago Journal Rankings: 4.170) |
| ISI Accession Number | WOS:000386940300010 |
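The abstract notes that kernel LRR in the clean-data case admits a closed-form solution. A minimal sketch of that idea, assuming the standard result that the minimizer of min ||Z||_* s.t. X = XZ is the shape-interaction matrix VV^T, which for the kernelized problem depends on the data only through the kernel matrix K (the toy data and function names below are illustrative, not from the paper):

```python
import numpy as np

# Toy data: columns drawn from two rank-1 linear subspaces, stacked in X.
rng = np.random.default_rng(0)
X = np.hstack([rng.standard_normal((5, 1)) @ rng.standard_normal((1, 10)),
               rng.standard_normal((5, 1)) @ rng.standard_normal((1, 10))])

def kernel_lrr_clean(K, tol=1e-8):
    """Closed-form minimizer of min ||Z||_* s.t. phi(X) = phi(X) Z.
    It depends on the data only through K = phi(X)^T phi(X):
    Z* = V_r V_r^T, where V_r holds the eigenvectors of K whose
    eigenvalues are (numerically) nonzero."""
    w, V = np.linalg.eigh(K)      # K is symmetric positive semidefinite
    V_r = V[:, w > tol]           # eigenvectors spanning the row space of X
    return V_r @ V_r.T

# With the linear kernel, this reduces to the original (clean-data) LRR.
K_lin = X.T @ X
Z = kernel_lrr_clean(K_lin)

# Sanity check: the self-expression constraint X = X Z holds.
assert np.allclose(X, X @ Z, atol=1e-6)
```

With a nonlinear kernel (e.g. RBF), only the construction of K changes; the eigendecomposition step is identical, which is what makes the clean-data case kernelizable. The robust (corrupted-data) RKLRR case has no closed form and is solved iteratively via the alternating direction method described in the abstract.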
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Xiao, Shijie | - |
dc.contributor.author | Tan, Mingkui | - |
dc.contributor.author | Xu, Dong | - |
dc.contributor.author | Dong, Zhao Yang | - |
dc.date.accessioned | 2022-11-03T02:21:05Z | - |
dc.date.available | 2022-11-03T02:21:05Z | - |
dc.date.issued | 2016 | - |
dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2016, v. 27, n. 11, p. 2268-2281 | - |
dc.identifier.issn | 2162-237X | - |
dc.identifier.uri | http://hdl.handle.net/10722/321732 | - |
dc.description.abstract | Recently, low-rank representation (LRR) has shown promising performance in many real-world applications such as face clustering. However, LRR may not achieve satisfactory results when dealing with the data from nonlinear subspaces, since it is originally designed to handle the data from linear subspaces in the input space. Meanwhile, the kernel-based methods deal with the nonlinear data by mapping it from the original input space to a new feature space through a kernel-induced mapping. To effectively cope with the nonlinear data, we first propose the kernelized version of LRR in the clean data case. We also present a closed-form solution for the resultant optimization problem. Moreover, to handle corrupted data, we propose the robust kernel LRR (RKLRR) approach, and develop an efficient optimization algorithm to solve it based on the alternating direction method. In particular, we show that both the subproblems in our optimization algorithm can be efficiently and exactly solved, and it is guaranteed to obtain a globally optimal solution. Besides, our proposed algorithm can also solve the original LRR problem, which is a special case of our RKLRR when using the linear kernel. In addition, based on our new optimization technique, the kernelization of some variants of LRR can be similarly achieved. Comprehensive experiments on synthetic data sets and real-world data sets clearly demonstrate the efficiency of our algorithm, as well as the effectiveness of RKLRR and the kernelization of two variants of LRR. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Neural Networks and Learning Systems | - |
dc.subject | kernel methods | - |
dc.subject | Low-rank representation (LRR) | - |
dc.title | Robust Kernel Low-Rank Representation | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TNNLS.2015.2472284 | - |
dc.identifier.scopus | eid_2-s2.0-85019316719 | - |
dc.identifier.volume | 27 | - |
dc.identifier.issue | 11 | - |
dc.identifier.spage | 2268 | - |
dc.identifier.epage | 2281 | - |
dc.identifier.eissn | 2162-2388 | - |
dc.identifier.isi | WOS:000386940300010 | - |