Conference Paper: Parallel image matrix compression for face recognition

Title: Parallel image matrix compression for face recognition
Authors: Xu, Dong; Yan, Shuicheng; Zhang, Lei; Li, Mingjing; Ma, Weiying; Liu, Zhengkai; Zhang, Hongjiang
Issue Date: 2005
Citation: Proceedings of the 11th International Multimedia Modelling Conference, MMM 2005, 2005, p. 232-238
Abstract: The canonical face recognition algorithms Eigenface and Fisherface are both based on a one-dimensional vector representation. However, with high feature dimensions and little training data, face recognition often suffers from the curse of dimensionality and the small sample size problem. Recent research [4] shows that face recognition based on a direct 2D matrix representation, i.e. 2DPCA, obtains better performance than that based on the traditional vector representation. However, three questions are left unresolved in the 2DPCA algorithm: 1) what is the meaning of the eigenvalues and eigenvectors of the covariance matrix in 2DPCA; 2) why can 2DPCA outperform Eigenface; and 3) how can the dimension be directly reduced after 2DPCA. In this paper, we analyze 2DPCA from a different view and prove that 2DPCA is actually a "localized" PCA with each row vector of an image treated as an object. With this explanation, we discover that the intrinsic reason 2DPCA can outperform Eigenface is that fewer feature dimensions and more samples are used in 2DPCA compared with Eigenface. To further reduce the dimension after 2DPCA, a two-stage strategy, namely parallel image matrix compression (PIMC), is proposed to compress the image matrix redundancy that exists among row vectors and column vectors. Exhaustive experimental results demonstrate that PIMC is superior to 2DPCA and Eigenface, and that PIMC+LDA outperforms 2DPCA+LDA and Fisherface. © 2005 IEEE.
Persistent Identifier: http://hdl.handle.net/10722/321361
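As a rough illustration of the abstract's two main ideas, the sketch below (plain NumPy, not the authors' code) builds the 2DPCA covariance by treating every image row vector as a PCA sample, then applies a second, column-side projection in the spirit of the two-stage PIMC strategy. The function names, the 32x32 image size, and the 8x8 target size are illustrative assumptions only.

# Minimal sketch (assumed implementation, not from the paper): 2DPCA viewed
# as PCA over image row vectors, followed by a second projection over
# columns, in the spirit of the two-stage PIMC idea.
import numpy as np

def row_covariance(images):
    """2DPCA covariance: sum of D^T D over mean-centered image matrices D,
    i.e. an ordinary covariance with every row vector treated as a sample."""
    mean = np.mean(images, axis=0)
    cov = np.zeros((images.shape[2], images.shape[2]))
    for A in images:
        D = A - mean
        cov += D.T @ D
    return cov / (images.shape[0] * images.shape[1])

def top_eigvecs(cov, k):
    """Return the k leading eigenvectors (as columns) of a symmetric matrix."""
    vals, vecs = np.linalg.eigh(cov)
    return vecs[:, np.argsort(vals)[::-1][:k]]

def two_stage_compress(images, k_cols, k_rows):
    """Stage 1: project rows onto k_cols directions (2DPCA, A -> A W).
    Stage 2: project columns of the reduced matrices onto k_rows directions,
    removing redundancy among both row and column vectors (V^T A W)."""
    W = top_eigvecs(row_covariance(images), k_cols)                 # right projection
    reduced = np.einsum('nij,jk->nik', images, W)                   # shape (n, h, k_cols)
    V = top_eigvecs(row_covariance(np.transpose(reduced, (0, 2, 1))), k_rows)
    return np.einsum('ij,njk->nik', V.T, reduced), W, V             # shape (n, k_rows, k_cols)

# Example: 100 synthetic 32x32 images compressed to 8x8 feature matrices.
imgs = np.random.rand(100, 32, 32)
feats, W, V = two_stage_compress(imgs, k_cols=8, k_rows=8)
print(feats.shape)  # (100, 8, 8)

The resulting feature matrices could then be vectorized and fed to LDA, which is the PIMC+LDA combination compared against 2DPCA+LDA and Fisherface in the abstract.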


DC Field | Value | Language
dc.contributor.author | Xu, Dong | -
dc.contributor.author | Yan, Shuicheng | -
dc.contributor.author | Zhang, Lei | -
dc.contributor.author | Li, Mingjing | -
dc.contributor.author | Ma, Weiying | -
dc.contributor.author | Liu, Zhengkai | -
dc.contributor.author | Zhang, Hongjiang | -
dc.date.accessioned | 2022-11-03T02:18:23Z | -
dc.date.available | 2022-11-03T02:18:23Z | -
dc.date.issued | 2005 | -
dc.identifier.citation | Proceedings of the 11th International Multimedia Modelling Conference, MMM 2005, 2005, p. 232-238 | -
dc.identifier.uri | http://hdl.handle.net/10722/321361 | -
dc.description.abstract | The canonical face recognition algorithms Eigenface and Fisherface are both based on a one-dimensional vector representation. However, with high feature dimensions and little training data, face recognition often suffers from the curse of dimensionality and the small sample size problem. Recent research [4] shows that face recognition based on a direct 2D matrix representation, i.e. 2DPCA, obtains better performance than that based on the traditional vector representation. However, three questions are left unresolved in the 2DPCA algorithm: 1) what is the meaning of the eigenvalues and eigenvectors of the covariance matrix in 2DPCA; 2) why can 2DPCA outperform Eigenface; and 3) how can the dimension be directly reduced after 2DPCA. In this paper, we analyze 2DPCA from a different view and prove that 2DPCA is actually a "localized" PCA with each row vector of an image treated as an object. With this explanation, we discover that the intrinsic reason 2DPCA can outperform Eigenface is that fewer feature dimensions and more samples are used in 2DPCA compared with Eigenface. To further reduce the dimension after 2DPCA, a two-stage strategy, namely parallel image matrix compression (PIMC), is proposed to compress the image matrix redundancy that exists among row vectors and column vectors. Exhaustive experimental results demonstrate that PIMC is superior to 2DPCA and Eigenface, and that PIMC+LDA outperforms 2DPCA+LDA and Fisherface. © 2005 IEEE. | -
dc.language | eng | -
dc.relation.ispartof | Proceedings of the 11th International Multimedia Modelling Conference, MMM 2005 | -
dc.title | Parallel image matrix compression for face recognition | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/MMMC.2005.57 | -
dc.identifier.scopus | eid_2-s2.0-56149118158 | -
dc.identifier.spage | 232 | -
dc.identifier.epage | 238 | -
