Some Novel Classifiers and Their Applications to Face Recognition
|School||Nanjing University of Technology and Engineering|
|Course||Applied Computer Technology|
|Keywords||classifier, pattern recognition, face recognition, nearest neighbor convex hull, kernel method, kernel nearest neighbor convex hull, sample selection, subspace sample selection, kernel subspace sample selection, affine subspace nearest points, kernel affine subspace nearest points|
In a pattern recognition system, the role of the classifier is to predict the label of a test sample from the training data. Designing a powerful classifier based on the known information, such as the training samples, is therefore an important task in pattern recognition.

A novel classifier named the Nearest Neighbor Convex Hull (NNCH) classifier for high-dimensional data classification is presented in this paper. In the NNCH classifier, the convex hull of the training samples of a class is taken as an estimate of that class's distribution, and the distance from a test sample to the convex hull is taken as the similarity measure for classification. Following the nearest neighbor rule, a test sample is assigned to the class of the nearest convex hull. The NNCH classifier is applied to face recognition, and the experimental results demonstrate its effectiveness.

As an important distance measure, the norm is widely used in pattern classification. In general, the same classifier performs differently under different distance definitions. This paper presents the l1-norm nearest neighbor convex hull (l1 NNCH) classifier, which replaces the l2 norm of the NNCH algorithm with the l1 norm: the l1 distance from a test point to the convex hull of a class's training set is taken as the similarity measure of the nearest neighbor rule. Experiments on the ORL face database and the NUST603 face database show the good performance of this method.

A novel pattern classification algorithm based on the kernel method, named the Kernel Nearest Neighbor Convex Hull (KNNCH) classifier, is also presented in this paper. The data from the input space are mapped into a higher-dimensional feature space by replacing the inner product in NNCH with an appropriately chosen kernel function, so that the NNCH classifier performs its classification task in that higher-dimensional feature space.
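As a minimal sketch of the core NNCH computation, the distance from a test sample to a class convex hull can be posed as a small quadratic program over the convex-combination coefficients. The code below is an illustration only, not the paper's implementation: the function names and the choice of SciPy's SLSQP solver are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_convex_hull(x, X):
    """Squared l2 distance from test point x to the convex hull of
    the rows of X, as a small quadratic program:
        min_a ||x - X^T a||^2   s.t.  a >= 0, sum(a) = 1.
    """
    n = X.shape[0]
    a0 = np.full(n, 1.0 / n)                       # start at the centroid
    obj = lambda a: np.sum((x - X.T @ a) ** 2)
    cons = ({'type': 'eq', 'fun': lambda a: a.sum() - 1.0},)
    bnds = [(0.0, None)] * n
    res = minimize(obj, a0, bounds=bnds, constraints=cons, method='SLSQP')
    return res.fun

def nnch_predict(x, class_sets):
    """Nearest neighbor rule: assign x to the class whose hull is nearest."""
    dists = [dist_to_convex_hull(x, X) for X in class_sets]
    return int(np.argmin(dists))
```

For the kernel variant (KNNCH), the same program would be rewritten so that the objective depends on the samples only through inner products, which are then replaced by kernel evaluations.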
Experiments on face recognition show the good performance of this new method.

Classifying a test sample with the NNCH classifier involves solving a convex quadratic programming problem for the distance from the test sample to the convex hull of every class, which requires large memory and long computation time on large-scale datasets. It is therefore important to reduce the computational cost of the NNCH classifier without degrading its prediction accuracy. In this paper, a sample selection method named the subspace sample selection (SSS) algorithm is used to select a subset of the data for the NNCH classifier. SSS is an iterative, per-class algorithm that, at each step, selects the sample farthest from the subspace spanned by the already chosen set. Experiments on the training-synthetic subset of the MIT-CBCL face recognition database show that the NNCH classifier combined with SSS reaches a 100% recognition rate with fewer samples and a much faster test speed.

The KNNCH classifier also involves solving convex quadratic programming problems, so reducing the data is likewise necessary when the training set is large. This paper uses a method named kernel subspace sample selection (KSSS) to choose the training samples. KSSS applies the kernel trick to the SSS algorithm, extending it to the kernel space. Experiments on the training-synthetic subset of the MIT-CBCL face recognition database show that the KSSS+KNNCH approach reaches a 100% recognition rate with fewer samples and a much faster test speed than KNNCH alone.

A novel linear classifier called the Affine Subspace Nearest Points (ASNP) classifier is presented in this paper. Inspired by the geometrical interpretation of the Support Vector Machine (SVM), in which the optimal separating plane bisects the segment joining the closest points of the two class convex hulls, the ASNP classifier expands the search area for the closest points from the convex hulls to the corresponding class affine subspaces.
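The greedy selection step of SSS can be sketched as follows. This is an illustrative implementation under stated assumptions: the paper describes only the "farthest sample to the subspace of the chosen set" criterion, so the initialization (largest-norm sample) and the use of a least-squares projection are assumptions of this sketch.

```python
import numpy as np

def subspace_sample_selection(X, k):
    """Greedy subspace sample selection (SSS) sketch for one class.

    Start from the largest-norm sample (an assumed initialization),
    then repeatedly add the sample with the largest residual after
    least-squares projection onto the span of the chosen set.
    """
    chosen = [int(np.argmax(np.linalg.norm(X, axis=1)))]
    for _ in range(k - 1):
        B = X[chosen].T                        # chosen samples as columns
        # residual of each sample after projection onto span(B)
        coef, *_ = np.linalg.lstsq(B, X.T, rcond=None)
        resid = np.linalg.norm(X.T - B @ coef, axis=0)
        resid[chosen] = -1.0                   # never re-pick chosen samples
        chosen.append(int(np.argmax(resid)))
    return chosen
```

KSSS would follow the same loop but compute the projection residuals entirely through kernel evaluations instead of explicit coordinates.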
The affine subspaces are taken as rough estimates of the class sample distributions, and their closest points are found. The hyperplane separating the affine subspaces with maximal margin is then constructed as the perpendicular bisector of the line segment joining the two closest points. Comparative experiments with the Nearest Neighbor classifier and SVM on the ORL, Yale, and Harvard face databases show the good performance of this algorithm.

This paper also presents the kernel affine subspace nearest points (KASNP) classifier, which extends the ASNP classifier with the kernel method. Experiments on face recognition show the good performance of this combined method.
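The ASNP construction described above can be sketched in a few lines: find the closest points between the two class affine subspaces (each spanned by the centered samples and shifted to the class mean) by least squares, then take the perpendicular bisector of the joining segment as the decision hyperplane. This is a minimal illustration, assuming the two affine subspaces do not intersect (otherwise the closest points coincide and the normal vector degenerates); function names are assumptions.

```python
import numpy as np

def asnp_train(X1, X2):
    """ASNP sketch: closest points between the two class affine subspaces,
    then the perpendicular bisector of the segment joining them."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    U1, U2 = (X1 - m1).T, (X2 - m2).T          # direction vectors as columns
    # closest points: minimize ||(m1 + U1 a) - (m2 + U2 b)|| by least squares
    A = np.hstack([U1, -U2])
    coef, *_ = np.linalg.lstsq(A, m2 - m1, rcond=None)
    a, b = coef[:U1.shape[1]], coef[U1.shape[1]:]
    p, q = m1 + U1 @ a, m2 + U2 @ b            # closest points
    w = p - q                                  # normal of the separating plane
    bias = -w @ (p + q) / 2.0                  # plane bisects the segment pq
    return w, bias

def asnp_predict(x, w, bias):
    """Class 1 on the positive side of the hyperplane, class 2 otherwise."""
    return 1 if w @ x + bias > 0 else 2
```

KASNP would apply the same construction in a kernel-induced feature space, expressing the closest points as kernel expansions over the training samples.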