Laplacian Support Vector Analysis for Subspace Discriminative Learning
In this paper we propose a novel dimensionality reduction method based on successive Laplacian SVM projections in orthogonal deflated subspaces. The proposed method, called Laplacian Support Vector Analysis, produces projection vectors that capture the discriminant information lying in the subspace orthogonal to the solution of the standard Laplacian SVM. We show that the optimal vectors in these deflated subspaces can be computed by successively training a standard SVM with specially designed deflation kernels. The resulting normal vectors contain discriminative information that can be used for feature extraction. In our analysis, we derive an explicit form for the deflation matrix of the mapped features in both the input space and the Hilbert space by using the kernel trick; we can therefore handle both linear and non-linear deflation transformations. Experimental results on several benchmark datasets demonstrate the strength of the proposed algorithm.
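The successive-deflation idea can be illustrated with a minimal linear sketch. The code below is an assumption-laden simplification, not the authors' method: it uses a plain linear SVM on binary labels in the input space instead of a Laplacian SVM with deflation kernels, and the helper name `deflated_svm_directions` is hypothetical. Each round trains an SVM, records its unit normal vector, then deflates the data onto the orthogonal complement of that direction before the next round.

```python
import numpy as np
from sklearn.svm import LinearSVC

def deflated_svm_directions(X, y, n_components=3, C=1.0):
    """Collect successive SVM normal vectors, deflating the data so each
    new direction lies (up to optimization tolerance) in the subspace
    orthogonal to the previously found ones. Linear, binary-label sketch;
    the paper expresses the kernelized deflation through specially
    designed deflation kernels instead."""
    X_defl = X.copy()
    directions = []
    for _ in range(n_components):
        svm = LinearSVC(C=C).fit(X_defl, y)
        w = svm.coef_.ravel()            # normal vector of the separating hyperplane
        w /= np.linalg.norm(w)           # unit-norm projection vector
        directions.append(w)
        # Deflate: remove the component along w from every sample,
        # i.e. X <- X (I - w w^T), so the next SVM is trained in the
        # orthogonal complement of the directions found so far.
        X_defl = X_defl - np.outer(X_defl @ w, w)
    return np.array(directions)          # rows are the projection vectors

# Usage: stack the normal vectors into a projection matrix for
# feature extraction, e.g.
#   W = deflated_svm_directions(X_train, y_train, n_components=5)
#   Z = X_train @ W.T
```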