This paper presents a new algorithm for classifying distributions. The algorithm combines the principle of margin maximization with a kernel trick applied to distributions, thereby bringing together the discriminative power of support vector machines and the well-developed framework of generative models. It can be applied to a range of real-life tasks in which data are represented as distributions, and it can also be used to introduce prior knowledge about invariances into a discriminative model. We illustrate the approach in detail for the case of Gaussian distributions on a toy problem, and present experiments on the real-life problem of invariant image classification.
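To make the idea concrete, below is a minimal sketch of an SVM over Gaussian distributions. It uses the expected-likelihood (probability product) kernel between Gaussians as the kernel on distributions, which is one common closed-form choice; this particular kernel, the helper names, and the toy data are illustrative assumptions and not necessarily the exact construction of the paper.

```python
# Sketch: SVM classification of Gaussian distributions via a precomputed
# kernel matrix. Assumes the expected-likelihood kernel between Gaussians.
import numpy as np
from sklearn.svm import SVC

def gaussian_kernel(mu1, S1, mu2, S2):
    """Expected-likelihood kernel k(p, q) = \\int p(x) q(x) dx for
    p = N(mu1, S1), q = N(mu2, S2): a Gaussian density evaluated at
    (mu1 - mu2) with covariance S1 + S2."""
    d = mu1.shape[0]
    S = S1 + S2
    diff = mu1 - mu2
    norm = (2 * np.pi) ** (-d / 2) * np.linalg.det(S) ** (-0.5)
    return norm * np.exp(-0.5 * diff @ np.linalg.solve(S, diff))

def gram_matrix(dists_a, dists_b):
    """Kernel matrix between two lists of (mean, covariance) pairs."""
    return np.array([[gaussian_kernel(ma, Sa, mb, Sb)
                      for (mb, Sb) in dists_b]
                     for (ma, Sa) in dists_a])

# Toy problem: each example is a 2-D Gaussian estimated from a small sample.
rng = np.random.default_rng(0)
def sample_dist(center):
    pts = np.asarray(center) + rng.normal(scale=0.5, size=(30, 2))
    return pts.mean(axis=0), np.cov(pts, rowvar=False)

train = [sample_dist([0, 0]) for _ in range(20)] + \
        [sample_dist([2, 2]) for _ in range(20)]
labels = np.array([0] * 20 + [1] * 20)

K = gram_matrix(train, train)
clf = SVC(kernel="precomputed").fit(K, labels)

test = [sample_dist([0, 0]), sample_dist([2, 2])]
print(clf.predict(gram_matrix(test, train)))  # expected: [0 1]
```

The margin maximization comes from the standard SVC objective, while the generative side enters only through the kernel between fitted distributions, which is where prior knowledge such as invariances could be encoded.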
Type: report
Date Issued: 2005
Publisher: IDIAP
Note: Submitted to NIPS
Written at: EPFL
Available on Infoscience: March 10, 2006