Article: Nonlinear discriminant analysis on embedded manifold
Title | Nonlinear discriminant analysis on embedded manifold |
---|---|
Authors | Yan, Shuicheng; Hu, Yuxiao; Xu, Dong; Zhang, Hong Jiang; Zhang, Benyu; Cheng, Qiansheng |
Keywords | Kernel design; Kernel machine; Kernel selection; Linear discriminant analysis (LDA); Manifold learning; Principal component analysis (PCA); Subspace learning |
Issue Date | 2007 |
Citation | IEEE Transactions on Circuits and Systems for Video Technology, 2007, v. 17, n. 4, p. 468-477 |
Abstract | Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, mainly focus on uncovering the latent low-dimensional geometric structure of the training samples in an unsupervised manner, where useful class information is ignored. Therefore, the derived low-dimensional representations are not necessarily optimal in discriminative capability. In this paper, we study the discriminant analysis problem by considering the nonlinear manifold structure of the data space. To this end, first, a new clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), is proposed to partition the samples into multiple clusters while ensuring that there are balanced samples for the classes within each cluster; approximately, each cluster can be considered a local patch on the embedded manifold. Then, the local discriminative projections for the different clusters are simultaneously calculated by optimizing the global Fisher criterion based on the cluster-weighted data representation. Compared with traditional linear/kernel discriminant analysis (KDA) algorithms, our proposed algorithm has the following characteristics: 1) it is essentially a KDA algorithm with a geometry-adaptive kernel tailored to the specific data structure, in contrast to traditional KDA, in which the kernel is fixed and independent of the data set; 2) it is approximately a locally linear yet globally nonlinear discriminant analyzer; 3) it does not need to store the original samples to compute the low-dimensional representation of new data; and 4) it is computationally efficient compared with traditional KDA when the sample number is large. A toy problem on artificial data demonstrates the effectiveness of the proposed algorithm in deriving discriminative representations for problems with nonlinear classification hyperplanes. The face recognition experiments on the YALE and CMU PIE databases show that the proposed algorithm significantly outperforms linear discriminant analysis (LDA) as well as Mixture LDA, and achieves higher accuracy than KDA with traditional kernels. © 2007 IEEE. |
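The two-step pipeline the abstract describes (class-balanced clustering into local patches, then Fisher-criterion projections) might be sketched roughly as follows. This is an illustrative simplification, not the paper's actual ICBKM or its cluster-weighted joint formulation: the greedy capacity-capped assignment, the function names, and the ridge term `reg` are all assumptions made for the sketch.

```python
import numpy as np

def intra_cluster_balanced_kmeans(X, y, n_clusters, n_iters=10, seed=0):
    """Toy balanced clustering: k-means-style center updates, but each class
    contributes at most ceil(class_size / n_clusters) samples per cluster,
    so every cluster sees a roughly class-balanced local patch."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)].copy()
    labels = np.full(len(X), -1)
    classes = np.unique(y)
    cap = {c: int(np.ceil((y == c).sum() / n_clusters)) for c in classes}
    for _ in range(n_iters):
        labels[:] = -1
        counts = {c: np.zeros(n_clusters, dtype=int) for c in classes}
        for i in rng.permutation(len(X)):
            dist = np.linalg.norm(centers - X[i], axis=1)
            for k in np.argsort(dist):  # nearest cluster with free capacity
                if counts[y[i]][k] < cap[y[i]]:
                    labels[i] = k
                    counts[y[i]][k] += 1
                    break
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels

def fisher_projection(X, y, n_dims=1, reg=1e-6):
    """Classical Fisher/LDA projection: leading eigenvectors of
    inv(Sw) @ Sb, with a small ridge `reg` for numerical stability."""
    d = X.shape[1]
    mu = X.mean(axis=0)
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(d), Sb))
    order = np.argsort(-vals.real)
    return vecs[:, order[:n_dims]].real

# Usage on two well-separated Gaussian classes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal([3.0, 0.0], 0.1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels = intra_cluster_balanced_kmeans(X, y, n_clusters=2)
w = fisher_projection(X, y)  # the paper fits per-cluster projections jointly
p = (X @ w).ravel()
```

Note the key difference from the paper: here a single global Fisher projection is fit, whereas the proposed method optimizes the projections of all local patches simultaneously under one global Fisher criterion, which is what makes it locally linear but globally nonlinear.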
Persistent Identifier | http://hdl.handle.net/10722/321953 |
ISSN | 1051-8215 |
ISI Accession Number | WOS:000246191900007 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yan, Shuicheng | - |
dc.contributor.author | Hu, Yuxiao | - |
dc.contributor.author | Xu, Dong | - |
dc.contributor.author | Zhang, Hong Jiang | - |
dc.contributor.author | Zhang, Benyu | - |
dc.contributor.author | Cheng, Qiansheng | - |
dc.date.accessioned | 2022-11-03T02:22:35Z | - |
dc.date.available | 2022-11-03T02:22:35Z | - |
dc.date.issued | 2007 | - |
dc.identifier.citation | IEEE Transactions on Circuits and Systems for Video Technology, 2007, v. 17, n. 4, p. 468-477 | - |
dc.identifier.issn | 1051-8215 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321953 | - |
dc.description.abstract | Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, mainly focus on uncovering the latent low-dimensional geometric structure of the training samples in an unsupervised manner, where useful class information is ignored. Therefore, the derived low-dimensional representations are not necessarily optimal in discriminative capability. In this paper, we study the discriminant analysis problem by considering the nonlinear manifold structure of the data space. To this end, first, a new clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), is proposed to partition the samples into multiple clusters while ensuring that there are balanced samples for the classes within each cluster; approximately, each cluster can be considered a local patch on the embedded manifold. Then, the local discriminative projections for the different clusters are simultaneously calculated by optimizing the global Fisher criterion based on the cluster-weighted data representation. Compared with traditional linear/kernel discriminant analysis (KDA) algorithms, our proposed algorithm has the following characteristics: 1) it is essentially a KDA algorithm with a geometry-adaptive kernel tailored to the specific data structure, in contrast to traditional KDA, in which the kernel is fixed and independent of the data set; 2) it is approximately a locally linear yet globally nonlinear discriminant analyzer; 3) it does not need to store the original samples to compute the low-dimensional representation of new data; and 4) it is computationally efficient compared with traditional KDA when the sample number is large. A toy problem on artificial data demonstrates the effectiveness of the proposed algorithm in deriving discriminative representations for problems with nonlinear classification hyperplanes. The face recognition experiments on the YALE and CMU PIE databases show that the proposed algorithm significantly outperforms linear discriminant analysis (LDA) as well as Mixture LDA, and achieves higher accuracy than KDA with traditional kernels. © 2007 IEEE. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Circuits and Systems for Video Technology | - |
dc.subject | Kernel design | - |
dc.subject | Kernel machine | - |
dc.subject | Kernel selection | - |
dc.subject | Linear discriminant analysis (LDA) | - |
dc.subject | Manifold learning | - |
dc.subject | Principal component analysis (PCA) | - |
dc.subject | Subspace learning | - |
dc.title | Nonlinear discriminant analysis on embedded manifold | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TCSVT.2007.893837 | - |
dc.identifier.scopus | eid_2-s2.0-34247597810 | - |
dc.identifier.volume | 17 | - |
dc.identifier.issue | 4 | - |
dc.identifier.spage | 468 | - |
dc.identifier.epage | 477 | - |
dc.identifier.isi | WOS:000246191900007 | - |