SPECIAL ISSUE: Intelligent Perception and Classification; Edited by F. H.-F. Leung and L. L.-W. Chan

EFFICIENT PSEUDOINVERSE LINEAR DISCRIMINANT ANALYSIS AND ITS NONLINEAR FORM FOR FACE RECOGNITION

https://doi.org/10.1142/S0218001407005946 | Cited by: 21 (Source: Crossref)

Pseudoinverse Linear Discriminant Analysis (PLDA) is a classical and pioneering method for dealing with the Small Sample Size (SSS) problem that arises when LDA is applied to tasks such as face recognition. However, it is expensive in both computation and storage because it directly manipulates extremely large d × d matrices, where d is the dimension of the sample image. As a result, although frequently cited in the literature, PLDA is rarely compared with newly proposed methods in terms of classification performance. In this paper, we propose a new feature extraction method named RSw + LDA, which is (1) much more efficient than PLDA in both computation and storage, and (2) theoretically equivalent to PLDA, in that it produces the same projection matrix. Further, to let PLDA better handle data with a nonlinear distribution, we propose a Kernel PLDA (KPLDA) method based on the well-known kernel trick. Finally, our experimental results on the AR face dataset, a challenging dataset with variations in expression, lighting and occlusion, show that PLDA (or RSw + LDA) achieves significantly higher classification accuracy than the recently proposed Linear Discriminant Analysis via QR decomposition and Discriminant Common Vectors, and that KPLDA yields better classification performance than PLDA and Kernel PCA.
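Although the abstract gives no code, the computation it describes can be sketched. Below is a minimal NumPy illustration, under stated assumptions, of the direct d × d form of PLDA: the singular within-class scatter S_w is replaced by its Moore-Penrose pseudoinverse before solving the usual LDA eigenproblem. Function and variable names (pseudo_lda, X, y) are illustrative, not from the paper; this is precisely the expensive formulation that the proposed RSw + LDA is said to avoid.

import numpy as np

def pseudo_lda(X, y, n_components):
    # X: (n, d) data matrix, y: length-n class labels. In the SSS case
    # n < d, so S_w is singular and its pseudoinverse stands in for the inverse.
    y = np.asarray(y)
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Moore-Penrose pseudoinverse in place of the (singular) inverse of S_w.
    M = np.linalg.pinv(S_w) @ S_b
    eigvals, eigvecs = np.linalg.eig(M)
    order = np.argsort(-eigvals.real)
    W = eigvecs[:, order[:n_components]].real
    return W  # (d, n_components) projection matrix

# Usage sketch: W = pseudo_lda(X_train, y_train, n_components=C - 1);
# projected features are then X @ W.

For face images d is typically in the tens of thousands, so forming and pseudoinverting the d × d scatter matrices above is exactly the computation and storage bottleneck the abstract attributes to PLDA.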
