
Radar HRRP Target Recognition Based on Kernel Methods

Author ChenBo
Tutor BaoZuo
School Xi'an University of Electronic Science and Technology
Course Signal and Information Processing
Keywords Radar automatic target recognition (RATR); High-resolution range profile (HRRP); Target-aspect sensitivity; Time-shift sensitivity; Kernel function; Kernel optimization; Feature extraction; Fisher criterion; Support vector machine (SVM); Manifold learning; Large margin; Feature selection
CLC TN957.5
Type PhD thesis
Year 2008

A target's high-resolution range profile (HRRP) is the projection of the complex echoes returned from the target's scattering centers onto the radar line of sight (LOS), and it contains informative signatures of the target structure. Moreover, unlike target recognition based on radar images such as SAR and ISAR images, HRRP target recognition does not require a rotation angle between the target and the radar; target HRRPs are therefore easier to obtain, and the technique is applicable to more types of radar. Consequently, radar HRRP target recognition has received intensive attention from the radar automatic target recognition (RATR) community.

Kernel methods have been applied successfully to a wide range of machine learning problems. They are algorithms that, by replacing the inner product with an appropriate positive-definite function (a kernel function), implicitly perform a nonlinear mapping of the input data into a high-dimensional feature space. The attractiveness of such algorithms stems from their elegant treatment of nonlinear problems and their efficiency on high-dimensional data. Because targets are non-cooperative and maneuvering, different targets are in general not linearly separable, which motivates kernel methods for HRRP recognition. This dissertation therefore presents our research on HRRP target recognition from three aspects: kernel feature extraction, classifier design, and kernel optimization. The work was supported by the Advanced Defense Research Programs of China (No. 413070501 and No. 51307060601) and the National Science Foundation of China (No. 60302009).

The main content of this dissertation is summarized as follows.

1. The first part reviews the background of kernel methods and introduces their fundamental theory and characteristics.
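To make the kernel trick concrete, the following minimal sketch (illustrative code, not from the dissertation) computes a Gaussian-kernel Gram matrix; replacing inner products with this positive-definite function implicitly maps the data into a high-dimensional feature space without ever forming that space explicitly:

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).

    Substituting this positive-definite function for the inner product
    implicitly maps the inputs into a high-dimensional feature space.
    """
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

# Toy HRRP-like vectors (hypothetical data, not real radar returns)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))
K = gaussian_kernel(X, X)
```

Any kernel algorithm (Kernel PCA, SVM, KSDA) then operates on `K` alone, never on the feature-space coordinates themselves.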
Moreover, to demonstrate the advantage of kernel methods over linear methods, an improved kernel principal component analysis (Kernel PCA) is proposed to deal with the target-aspect, time-shift, and amplitude-scale sensitivity of HRRP samples.

2. Linear discriminant analysis (LDA) is a popular method for linear dimensionality reduction that maximizes between-class scatter while minimizing within-class scatter. However, LDA is optimal only when all classes are generated from underlying multivariate normal distributions with a common covariance matrix but different means, and each class forms a single cluster. To overcome these limitations, subclass discriminant analysis (SDA) was recently proposed. We extend SDA to kernel SDA (KSDA) in the feature space, which can yield a better subspace for the classification task, since a nonlinear clustering technique can find the underlying subclasses more accurately in the feature space and nonlinear LDA can provide a nonlinear discriminant hyperplane. Furthermore, a reformulation of SDA is given that avoids complicated derivations in the feature space.

3. The third section focuses on simply and efficiently reducing the number of support vectors (SVs) in the decision function of the support vector machine (SVM), so as to speed up the SVM decision procedure. The SVM is currently considerably slower in the test phase than other approaches with similar generalization performance, which restricts its application to real-time tasks. Because in practice the embedded data lie in a subspace of the kernel-induced high-dimensional space, we can search for a set of basis vectors (BVs) that approximately express all the SVs, and the number of basis vectors is usually smaller than the number of SVs.

4. The fourth section addresses the optimization of kernel functions. The main work concerns the following three aspects: (1) we develop a kernel optimization method based on a fusion kernel for HRRP recognition.
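The basis-vector idea in item 3 can be sketched as follows: given fixed basis vectors, the best coefficients are a least-squares fit to the SV expansion in the kernel-induced feature space. This is an illustrative sketch under our own simplifying assumptions (a Gaussian kernel, basis vectors chosen as a subset of the SVs); the dissertation's selection procedure may differ:

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def reduce_support_vectors(SV, alpha, BV, gamma=1.0):
    """Coefficients beta minimizing || sum_i alpha_i phi(x_i)
    - sum_j beta_j phi(z_j) ||^2 in feature space, for fixed
    basis vectors z_j: beta = K_zz^{-1} K_zs alpha."""
    K_zz = gaussian_kernel(BV, BV, gamma)
    K_zs = gaussian_kernel(BV, SV, gamma)
    # small ridge term for numerical stability
    return np.linalg.solve(K_zz + 1e-8 * np.eye(len(BV)), K_zs @ alpha)

rng = np.random.default_rng(1)
SV = rng.normal(size=(50, 16))     # hypothetical support vectors
alpha = rng.normal(size=50)
BV = SV[:10]                       # here: simply the first 10 SVs
beta = reduce_support_vectors(SV, alpha, BV)

x = rng.normal(size=(1, 16))
f_full = gaussian_kernel(x, SV) @ alpha   # 50 kernel evaluations
f_fast = gaussian_kernel(x, BV) @ beta    # 10 kernel evaluations
```

The test-phase cost drops from one kernel evaluation per SV to one per basis vector; the approximation quality depends on how well the chosen BVs span the SVs.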
By fusing l1-norm and l2-norm Gaussian kernels, our method combines their different characteristics so that the kernel function is optimized and, at the same time, the speckle fluctuations of HRRP are suppressed. (2) Ideally, the data are expected to be linearly separable in the kernel-induced feature space, so the Fisher linear discriminant criterion can be used as a cost function to optimize the kernel function. In many applications, however, the data may not be linearly separable even after the kernel transformation; for example, the data may have a multimodal distribution. In this case a nonlinear classifier is preferred, and the Fisher criterion is clearly not a suitable kernel optimization rule. Motivated by this issue, we propose a localized kernel Fisher criterion, in place of the traditional Fisher criterion, as the kernel optimization rule; it increases the local margins between embedded classes in the kernel-induced feature space and thus improves the classification performance of kernel-based classifiers. (3) The Fisher criterion is optimal only when all classes are generated from underlying multivariate normal distributions with a common covariance matrix but different means, and each class forms a single cluster. Because of these assumptions, the Fisher criterion is not a suitable kernel optimization rule in some applications. To address this problem, many improved discriminant criteria (DC) have recently been developed. To apply these discriminant criteria to kernel optimization, we propose a unified kernel optimization framework based on a data-dependent kernel function, which can use any discriminant criterion formulated in a pairwise manner as its objective function.
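A data-dependent kernel of the kind such a framework builds on can be sketched as a conformal transformation k~(x, y) = q(x) q(y) k0(x, y), where the factor q is parameterized by coefficients that the optimization would tune against a pairwise discriminant criterion. The parameterization below (a base kernel weighted over a few "empirical core" points) is our illustrative assumption, not the dissertation's exact construction:

```python
import numpy as np

def gaussian(X, Y, gamma=1.0):
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def data_dependent_kernel(X, cores, coef, gamma=1.0):
    """K~[i, j] = q(x_i) * q(x_j) * K0[i, j], with
    q(x) = coef[0] + sum_m coef[m + 1] * k1(x, a_m).

    K~ = D K0 D with D diagonal, so positive semidefiniteness of
    the base kernel K0 is preserved whenever q stays positive.
    """
    K0 = gaussian(X, X, gamma)
    q = coef[0] + gaussian(X, cores, gamma) @ coef[1:]
    return (q[:, None] * q[None, :]) * K0

rng = np.random.default_rng(3)
X = rng.normal(size=(8, 16))
cores = X[:3]              # hypothetical empirical cores: a few samples
coef = np.full(4, 0.5)     # positive coefficients keep q > 0
K = data_dependent_kernel(X, cores, coef)
```

Optimizing `coef` then reduces to maximizing a pairwise discriminant objective of `K` against class-dependent affinity matrices, which is exactly the role of the unified framework described above.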
Under this framework, to employ a different discriminant criterion one only needs to change the corresponding affinity matrices, without resorting to any complicated derivations in the feature space.

5. In the fifth part, the manifold geometry of radar HRRP data is first explored. Then, according to the characteristics of target-aspect sensitivity, a method that adaptively segments the aspect sectors by evaluating the curvature of the HRRP manifold is proposed for HRRP recognition.

6. Finally, we propose a novel feature selection algorithm. Feature selection is a difficult combinatorial task in machine learning and is of high practical relevance. We present a large margin feature weighting method for the k-nearest neighbor (kNN) classifier. The method learns the feature weighting factors by minimizing a cost function that aims to separate points in different classes by a large margin, pull points of the same class closer together, and use as few features as possible. The resulting optimization problem can be solved efficiently by linear programming.
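The feature-weighting idea in item 6 can be sketched as a linear program: with weighted l1 distances the margin constraints are linear in the weights, and an l1 penalty on the weights encourages sparsity. This is an illustrative formulation under our own assumptions (nearest hit/miss found under the unweighted metric, fixed trade-off C); the dissertation's exact cost function may differ:

```python
import numpy as np
from scipy.optimize import linprog

def large_margin_weights(X, y, C=1.0):
    """Learn nonnegative feature weights w for a weighted-l1 kNN metric.

    For each point, the nearest different-class neighbour (miss) should
    be at least margin 1 farther than the nearest same-class neighbour
    (hit), up to slack xi.  Minimize sum(w) + C * sum(xi):
    the l1 term on w drives unused features to zero.
    """
    N, D = X.shape
    # nearest hit / miss under plain Euclidean distance
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    same = y[:, None] == y[None, :]
    g = np.empty((N, D))
    for i in range(N):
        hit = np.where(same[i], d2[i], np.inf).argmin()
        miss = np.where(~same[i], d2[i], np.inf).argmin()
        # per-feature l1 distance gap: miss minus hit
        g[i] = np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    # variables v = [w_1..w_D, xi_1..xi_N]; constraint: w.g_i + xi_i >= 1
    c = np.concatenate([np.ones(D), C * np.ones(N)])
    A_ub = np.hstack([-g, -np.eye(N)])
    b_ub = -np.ones(N)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    return res.x[:D]

# Toy data: only feature 0 separates the two classes
rng = np.random.default_rng(2)
y = np.repeat([0, 1], 20)
X = rng.normal(size=(40, 5))
X[:, 0] += 3.0 * y
w = large_margin_weights(X, y)
```

On such data the LP concentrates weight on the informative feature, illustrating how the large-margin objective and the l1 penalty together perform feature selection.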
