Article: Nonlinear discriminant analysis on embedded manifold

Title: Nonlinear discriminant analysis on embedded manifold
Authors: Yan, Shuicheng; Hu, Yuxiao; Xu, Dong; Zhang, Hong Jiang; Zhang, Benyu; Cheng, Qiansheng
Keywords: Kernel design; Kernel machine; Kernel selection; Linear discriminant analysis (LDA); Manifold learning; Principal component analysis (PCA); Subspace learning
Issue Date: 2007
Citation: IEEE Transactions on Circuits and Systems for Video Technology, 2007, v. 17, n. 4, p. 468-477
Abstract: Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, mainly focus on uncovering the latent low-dimensional geometric structure of the training samples in an unsupervised manner, where useful class information is ignored. The derived low-dimensional representations are therefore not necessarily optimal in discriminative capability. In this paper, we study the discriminant analysis problem by considering the nonlinear manifold structure of the data space. To this end, a new clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), is first proposed to partition the samples into multiple clusters while ensuring that the classes are balanced within each cluster; approximately, each cluster can be considered a local patch on the embedded manifold. Then, the local discriminative projections for the different clusters are calculated simultaneously by optimizing the global Fisher criterion based on the cluster-weighted data representation. Compared with traditional linear/kernel discriminant analysis (KDA) algorithms, our proposed algorithm has the following characteristics: 1) it is essentially a KDA algorithm with a geometry-adaptive kernel tailored to the specific data structure, in contrast to traditional KDA, in which the kernel is fixed and independent of the data set; 2) it is approximately a locally linear yet globally nonlinear discriminant analyzer; 3) it does not need to store the original samples to compute the low-dimensional representation of new data; and 4) it is computationally efficient compared with traditional KDA when the number of samples is large. A toy problem on artificial data demonstrates the effectiveness of our proposed algorithm in deriving discriminative representations for problems with nonlinear classification hyperplanes. Face recognition experiments on the YALE and CMU PIE databases show that our proposed algorithm significantly outperforms linear discriminant analysis (LDA) as well as Mixture LDA, and has higher accuracy than KDA with traditional kernels. © 2007 IEEE.
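The global Fisher criterion that the abstract refers to builds on classical linear discriminant analysis: find projections that maximize between-class scatter relative to within-class scatter. As an illustration only (a minimal NumPy sketch of plain Fisher LDA, not the paper's cluster-weighted ICBKM formulation), the criterion can be solved as a generalized eigenproblem:

```python
import numpy as np

def fisher_lda(X, y, n_components=1):
    """Classical Fisher LDA: maximize between-class scatter Sb
    relative to within-class scatter Sw over projections W."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Solve Sw^{-1} Sb w = lambda w; small ridge keeps Sw invertible.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs[:, order[:n_components]].real

# Two well-separated Gaussian classes in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
               rng.normal([3, 3], 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = fisher_lda(X, y)      # shape (2, 1): one discriminant direction
proj = X @ W              # 1-D projections; the two classes separate cleanly
```

The paper's method differs in that it applies such projections locally, per ICBKM cluster, and couples them through a single global objective, which is what makes it locally linear but globally nonlinear.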
Persistent Identifier: http://hdl.handle.net/10722/321953
ISSN: 1051-8215
2023 Impact Factor: 8.3
2023 SCImago Journal Rankings: 2.299
ISI Accession Number ID: WOS:000246191900007

 

DC Field | Value | Language
dc.contributor.author | Yan, Shuicheng | -
dc.contributor.author | Hu, Yuxiao | -
dc.contributor.author | Xu, Dong | -
dc.contributor.author | Zhang, Hong Jiang | -
dc.contributor.author | Zhang, Benyu | -
dc.contributor.author | Cheng, Qiansheng | -
dc.date.accessioned | 2022-11-03T02:22:35Z | -
dc.date.available | 2022-11-03T02:22:35Z | -
dc.date.issued | 2007 | -
dc.identifier.citation | IEEE Transactions on Circuits and Systems for Video Technology, 2007, v. 17, n. 4, p. 468-477 | -
dc.identifier.issn | 1051-8215 | -
dc.identifier.uri | http://hdl.handle.net/10722/321953 | -
dc.description.abstract | Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, mainly focus on uncovering the latent low-dimensional geometric structure of the training samples in an unsupervised manner, where useful class information is ignored. The derived low-dimensional representations are therefore not necessarily optimal in discriminative capability. In this paper, we study the discriminant analysis problem by considering the nonlinear manifold structure of the data space. To this end, a new clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), is first proposed to partition the samples into multiple clusters while ensuring that the classes are balanced within each cluster; approximately, each cluster can be considered a local patch on the embedded manifold. Then, the local discriminative projections for the different clusters are calculated simultaneously by optimizing the global Fisher criterion based on the cluster-weighted data representation. Compared with traditional linear/kernel discriminant analysis (KDA) algorithms, our proposed algorithm has the following characteristics: 1) it is essentially a KDA algorithm with a geometry-adaptive kernel tailored to the specific data structure, in contrast to traditional KDA, in which the kernel is fixed and independent of the data set; 2) it is approximately a locally linear yet globally nonlinear discriminant analyzer; 3) it does not need to store the original samples to compute the low-dimensional representation of new data; and 4) it is computationally efficient compared with traditional KDA when the number of samples is large. A toy problem on artificial data demonstrates the effectiveness of our proposed algorithm in deriving discriminative representations for problems with nonlinear classification hyperplanes. Face recognition experiments on the YALE and CMU PIE databases show that our proposed algorithm significantly outperforms linear discriminant analysis (LDA) as well as Mixture LDA, and has higher accuracy than KDA with traditional kernels. © 2007 IEEE. | -
dc.language | eng | -
dc.relation.ispartof | IEEE Transactions on Circuits and Systems for Video Technology | -
dc.subject | Kernel design | -
dc.subject | Kernel machine | -
dc.subject | Kernel selection | -
dc.subject | Linear discriminant analysis (LDA) | -
dc.subject | Manifold learning | -
dc.subject | Principal component analysis (PCA) | -
dc.subject | Subspace learning | -
dc.title | Nonlinear discriminant analysis on embedded manifold | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/TCSVT.2007.893837 | -
dc.identifier.scopus | eid_2-s2.0-34247597810 | -
dc.identifier.volume | 17 | -
dc.identifier.issue | 4 | -
dc.identifier.spage | 468 | -
dc.identifier.epage | 477 | -
dc.identifier.isi | WOS:000246191900007 | -
