Article: Characterization of all solutions for undersampled uncorrelated linear discriminant analysis problems

Title: Characterization of all solutions for undersampled uncorrelated linear discriminant analysis problems
Authors: Chu, D; Goh, ST; Hung, YS
Keywords: Data Dimensionality Reduction; QR Factorization; Uncorrelated Linear Discriminant Analysis
Issue Date: 2011
Publisher: Society for Industrial and Applied Mathematics. The journal's web site is located at http://www.siam.org/journals/simax.php
Citation: SIAM Journal on Matrix Analysis and Applications, 2011, v. 32, n. 3, p. 820-844
Abstract: In this paper the uncorrelated linear discriminant analysis (ULDA) for undersampled problems is studied. The main contributions of the present work include the following: (i) all solutions of the optimization problem used for establishing the ULDA are parameterized explicitly; (ii) the optimal solutions among all solutions of the corresponding optimization problem are characterized in terms of both the ratio of between-class distance to within-class distance and the maximum likelihood classification, and it is proved that these optimal solutions are exactly the solutions of the corresponding optimization problem with minimum Frobenius norm (and also minimum nuclear norm); these properties provide a good mathematical justification for preferring the minimum-norm transformation over other possible solutions as the optimal transformation in ULDA; (iii) explicit necessary and sufficient conditions are provided to ensure that these minimal solutions lead to a larger ratio of between-class distance to within-class distance, and hence to greater discrimination in the reduced subspace than in the original data space; our numerical experiments show that these necessary and sufficient conditions generally hold. Furthermore, a new and fast ULDA algorithm is developed, which is eigendecomposition-free and SVD-free, and its effectiveness is demonstrated on several real-world data sets. © 2011 Society for Industrial and Applied Mathematics.
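For readers who want a concrete picture of the transformation the abstract refers to, the following is a minimal NumPy sketch of the classical SVD-based construction of the minimum Frobenius-norm ULDA transformation. It is only the standard approach that the article builds on, not the paper's eigendecomposition-free, SVD-free algorithm, and all function and variable names are illustrative assumptions rather than taken from the paper.

    import numpy as np

    def ulda_transform(X, y):
        """X: d x n data matrix (one sample per column); y: length-n class labels.
        Returns G (d x q) with G^T St G = I, i.e. the reduced features are
        mutually uncorrelated (the ULDA constraint)."""
        X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        c = X.mean(axis=1, keepdims=True)                  # global centroid
        # Scatter-matrix factors: St = Ht @ Ht.T, Sb = Hb @ Hb.T
        Ht = X - c
        Hb = np.column_stack(
            [np.sqrt(np.sum(y == k)) * (X[:, y == k].mean(axis=1) - c.ravel())
             for k in np.unique(y)])
        # Reduced SVD of Ht; t = rank(St)
        U, s, _ = np.linalg.svd(Ht, full_matrices=False)
        t = int(np.sum(s > 1e-12 * s[0]))
        Ut, st = U[:, :t], s[:t]
        # Whiten Sb inside range(St), then keep its dominant left singular vectors
        B = (Ut / st).T @ Hb                               # = diag(1/st) @ Ut.T @ Hb
        P, sb, _ = np.linalg.svd(B, full_matrices=False)
        q = int(np.sum(sb > 1e-12 * max(sb[0], 1e-30)))
        # Minimum Frobenius-norm (and minimum nuclear-norm) ULDA transformation
        return (Ut / st) @ P[:, :q]

Given a d x n data matrix X and labels y, G = ulda_transform(X, y) satisfies G.T @ (Ht @ Ht.T) @ G = I, and G.T @ X gives the uncorrelated reduced features; the paper's contribution is to characterize all solutions of this optimization problem and to compute the minimum-norm one without SVDs or eigendecompositions.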
Persistent Identifier: http://hdl.handle.net/10722/155676
ISSN: 0895-4798
2021 Impact Factor: 1.908
2020 SCImago Journal Rankings: 1.268
ISI Accession Number ID: WOS:000295399200009
Funding Agency: NUS
Grant Number: R-146-000-140-112
Funding Information: The work of these authors was supported by NUS research grant R-146-000-140-112.

DC Field | Value | Language
dc.contributor.author | Chu, D | en_US
dc.contributor.author | Goh, ST | en_US
dc.contributor.author | Hung, YS | en_US
dc.date.accessioned | 2012-08-08T08:34:47Z | -
dc.date.available | 2012-08-08T08:34:47Z | -
dc.date.issued | 2011 | en_US
dc.identifier.citation | SIAM Journal On Matrix Analysis And Applications, 2011, v. 32 n. 3, p. 820-844 | en_US
dc.identifier.issn | 0895-4798 | en_US
dc.identifier.uri | http://hdl.handle.net/10722/155676 | -
dc.description.abstract | In this paper the uncorrelated linear discriminant analysis (ULDA) for undersampled problems is studied. The main contributions of the present work include the following: (i) all solutions of the optimization problem used for establishing the ULDA are parameterized explicitly; (ii) the optimal solutions among all solutions of the corresponding optimization problem are characterized in terms of both the ratio of between-class distance to within-class distance and the maximum likelihood classification, and it is proved that these optimal solutions are exactly the solutions of the corresponding optimization problem with minimum Frobenius norm, also minimum nuclear norm; these properties provide a good mathematical justification for preferring the minimum-norm transformation over other possible solutions as the optimal transformation in ULDA; (iii) explicit necessary and sufficient conditions are provided to ensure that these minimal solutions lead to a larger ratio of between-class distance to within-class distance, thereby achieving larger discrimination in the reduced subspace than that in the original data space, and our numerical experiments show that these necessary and sufficient conditions hold true generally. Furthermore, a new and fast ULDA algorithm is developed, which is eigendecomposition-free and SVD-free, and its effectiveness is demonstrated by some real-world data sets. © 2011 Society for Industrial and Applied Mathematics. | en_US
dc.language | eng | en_US
dc.publisher | Society for Industrial and Applied Mathematics. The Journal's web site is located at http://www.siam.org/journals/simax.php | -
dc.relation.ispartof | SIAM Journal on Matrix Analysis and Applications | en_US
dc.rights | © 2011 Society for Industrial and Applied Mathematics. First Published in SIAM Journal on Matrix Analysis and Applications in volume 32, issue 3, published by the Society for Industrial and Applied Mathematics (SIAM). | -
dc.subject | Data Dimensionality Reduction | en_US
dc.subject | QR Factorization | en_US
dc.subject | Uncorrelated Linear Discriminant Analysis | en_US
dc.title | Characterization of all solutions for undersampled uncorrelated linear discriminant analysis problems | en_US
dc.type | Article | en_US
dc.identifier.email | Hung, YS: yshung@eee.hku.hk | en_US
dc.identifier.authority | Hung, YS=rp00220 | en_US
dc.description.nature | published_or_final_version | en_US
dc.identifier.doi | 10.1137/100792007 | en_US
dc.identifier.scopus | eid_2-s2.0-80054044731 | en_US
dc.identifier.hkuros | 206381 | -
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-80054044731&selection=ref&src=s&origin=recordpage | en_US
dc.identifier.volume | 32 | en_US
dc.identifier.issue | 3 | en_US
dc.identifier.spage | 820 | en_US
dc.identifier.epage | 844 | en_US
dc.identifier.isi | WOS:000295399200009 | -
dc.publisher.place | United States | en_US
dc.identifier.scopusauthorid | Chu, D=7201734138 | en_US
dc.identifier.scopusauthorid | Goh, ST=36348183400 | en_US
dc.identifier.scopusauthorid | Hung, YS=8091656200 | en_US
dc.identifier.issnl | 0895-4798 | -
