Boosting performance for 2D linear discriminant analysis via regression
Two dimensional linear discriminant analysis (2DLDA) has received much interest in recent years. However, 2DLDA can make the pairwise distances between any two classes significantly unbalanced, which may degrade its performance. Moreover, 2DLDA can also suffer from the small sample size problem. Based on these observations, we propose two novel algorithms: regularized 2DLDA and Ridge Regression for 2DLDA (RR-2DLDA). Regularized 2DLDA extends 2DLDA by introducing a regularization parameter to deal with the small sample size problem. RR-2DLDA integrates ridge regression into regularized 2DLDA to balance the distances among the different classes after the transformation. These proposed algorithms overcome the limitations of 2DLDA and boost recognition accuracy. Experimental results on the Yale, PIE and FERET databases show that RR-2DLDA is superior not only to 2DLDA but also to other state-of-the-art algorithms.
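The regularized 2DLDA idea described in the abstract can be sketched in a few lines: build the between-class and within-class image scatter matrices directly from the 2D image matrices, add a regularization term `lam * I` to the within-class scatter so it stays invertible under small sample sizes, and solve the resulting eigenproblem. This is a minimal one-sided sketch under stated assumptions, not the paper's exact formulation; the function name, the `lam` parameter, and the choice of the right-sided (column) projection are illustrative assumptions.

```python
import numpy as np

def regularized_2dlda(X, y, k, lam=1e-3):
    """Hedged sketch of one-sided regularized 2DLDA.

    X   : array of shape (n_samples, h, w), one image matrix per sample
    y   : class labels of length n_samples
    k   : number of projection vectors to keep
    lam : regularization parameter (assumption: added to the within-class
          scatter to handle the small sample size problem)
    Returns a (w, k) projection matrix W; images are projected as X_i @ W.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)          # global mean image, shape (h, w)
    w = X.shape[2]
    Sb = np.zeros((w, w))              # between-class image scatter
    Sw = np.zeros((w, w))              # within-class image scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)           # class mean image
        d = mc - mean_all
        Sb += len(Xc) * d.T @ d
        for Xi in Xc:
            e = Xi - mc
            Sw += e.T @ e
    # Regularization keeps Sw well conditioned when samples are scarce.
    Sw_reg = Sw + lam * np.eye(w)
    # Solve the generalized eigenproblem Sb v = mu * Sw_reg v.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw_reg, Sb))
    order = np.argsort(-vals.real)[:k]
    return vecs[:, order].real
```

A two-sided variant would repeat the same construction on the row scatters (transposed images) to obtain a left projection as well; the RR-2DLDA step of the paper, which rebalances between-class distances via ridge regression, is not shown here.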
Showing items related by title, author, creator and subject.
The matrix form for weighted linear discriminant analysis and fractional linear discriminant analysis — Xu, T.; Lu, C.; Liu, Wan-quan (2009) In this paper we will extend the recently proposed weighted linear discriminant analysis (W_LDA) and fraction-step linear discriminant analysis (F_LDA) from one dimension vector form to the case of two dimension matrix ...
Lu, C.; An, Senjian; Liu, Wan-quan; Liu, X. (2009)Two Dimensional Linear Discrimination Analysis (2DLDA) is an effective feature extraction approach for face recognition, which manipulates on the two dimensional image matrices directly. However, some between-class distances ...
Lu, C.; An, Senjian; Liu, Wan-Quan; Liu, X. (2011)Two Dimensional Linear Discrimination Analysis (2DLDA) is an effective feature extraction approach for face recognition, which manipulates on the two dimensional image matrices directly. However, some between-class distances ...