Abstract

Kernelized Support Vector Machines (SVM) have gained the status of off-the-shelf classifiers, able to deliver state-of-the-art performance on almost any problem. Still, their practical use is constrained by their computational and memory complexity, which grows super-linearly with the number of training samples. In order to retain the low training and testing complexity of linear classifiers and the flexibility of non-linear ones, a growing, promising alternative is represented by methods that learn non-linear classifiers through local combinations of linear ones. In this paper we propose a new multi-class local classifier, based on a latent SVM formulation. The proposed classifier makes use of a set of linear models that are linearly combined using sample- and class-specific weights. Thanks to the latent formulation, the combination coefficients are modeled as latent variables. We allow soft combinations and we provide a closed-form solution for their estimation, resulting in an efficient prediction rule. This novel formulation allows learning the sample-specific weights and the linear classifiers in a principled way, within a single optimization problem, using a CCCP optimization procedure. Extensive experiments on ten standard UCI machine learning datasets, one large binary dataset, three character and digit recognition databases, and a visual place categorization dataset show the power of the proposed approach.
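
The prediction rule described above can be pictured with a minimal sketch: K local linear models per class produce scores, and a sample- and class-specific soft combination of those scores yields the final decision. The softmax-style weighting below is an illustrative assumption; the paper's actual closed-form combination coefficients and the CCCP training procedure are not reproduced here.

```python
import numpy as np

# Minimal sketch of prediction by a soft combination of local linear models.
# Assumptions (not from the paper): K local linear models per class, and
# combination weights obtained by a softmax over the per-model scores.

def predict(x, W, b, temperature=1.0):
    """x: (d,) sample; W: (K, C, d) weights of K linear models over C classes;
    b: (K, C) biases. Returns the predicted class index."""
    scores = np.einsum('kcd,d->kc', W, x) + b      # per-model, per-class linear scores
    alpha = np.exp(scores / temperature)           # sample- and class-specific weights
    alpha /= alpha.sum(axis=0, keepdims=True)      # normalize over the K local models
    combined = (alpha * scores).sum(axis=0)        # soft combination of linear models
    return int(np.argmax(combined))

# Illustrative usage with random data
rng = np.random.default_rng(0)
K, C, d = 3, 4, 10
W, b = rng.normal(size=(K, C, d)), rng.normal(size=(K, C))
x = rng.normal(size=d)
print(predict(x, W, b))
```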
