Laplacian Support Vector Analysis for Subspace Discriminative Learning

In this paper we propose a novel dimensionality reduction method based on successive Laplacian SVM projections in orthogonal deflated subspaces. The proposed method, called Laplacian Support Vector Analysis, produces projection vectors that capture the discriminant information lying in the subspace orthogonal to that of the standard Laplacian SVM. We show that the optimal vectors in these deflated subspaces can be computed by successively training a standard SVM with specially designed deflation kernels. The resulting normal vectors contain discriminative information that can be used for feature extraction. In our analysis, we derive an explicit form for the deflation matrix of the mapped features in both the input space and the Hilbert space by using the kernel trick, and can thus handle both linear and non-linear deflation transformations. Experimental results on several benchmark datasets illustrate the strength of the proposed algorithm.
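The core loop of the linear case (train an SVM, keep its normal vector, deflate the data onto the orthogonal complement of all previous normals, and retrain) can be sketched as follows. This is a minimal illustration only: it uses scikit-learn's `LinearSVC` as a stand-in for the paper's Laplacian SVM, and the function name and parameters are our own assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

def svm_deflation_features(X, y, n_components=2, C=1.0):
    """Sketch of successive SVM projections with linear deflation
    (hypothetical helper; ordinary LinearSVC stands in for the
    Laplacian SVM used in the paper)."""
    Xd = X.astype(float).copy()
    normals = []
    for _ in range(n_components):
        clf = LinearSVC(C=C, dual=False).fit(Xd, y)
        w = clf.coef_.ravel()
        w /= np.linalg.norm(w)
        normals.append(w)
        # Deflation: remove the component along w from every sample, so
        # the next SVM must find a discriminative direction in the
        # subspace orthogonal to all previous normal vectors.
        Xd = Xd - np.outer(Xd @ w, w)
    W = np.array(normals)      # (n_components, n_features)
    return X @ W.T             # projected (extracted) features

# Toy usage on random two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)),
               rng.normal(1.5, 1.0, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Z = svm_deflation_features(X, y, n_components=2)
print(Z.shape)  # (40, 2)
```

The paper's contribution is to perform this deflation implicitly in the Hilbert space via specially designed deflation kernels, so the same scheme also covers non-linear transformations; the sketch above only covers the explicit linear variant.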

Published in:
Proceedings of the 22nd International Conference on Pattern Recognition, 1609-1614
Presented at:
22nd International Conference on Pattern Recognition, Stockholm, Sweden, August 24-28, 2014

Record created 2015-10-03, last modified 2019-12-05
