Abstract

Combining several classifiers has proved to be an effective machine learning technique. Two factors clearly influence the performance of an ensemble: the diversity between classifiers and the individual accuracy of each classifier. We use an information-theoretic framework to establish a link between these quantities and, since they appear to be in conflict, we propose an information-theoretic measure that expresses a trade-off between individual accuracy and diversity. This measure can be applied directly to the selection of an ensemble from a pool of classifiers. We then consider the particular case of multiple Support Vector Machines under this new measure, covering genetic algorithm optimization as well as an adaptation of the Kernel-Adatron algorithm to online learning of multiple SVMs. The results are compared to standard multiple-SVM techniques.
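The trade-off measure itself is not given in this abstract. As an illustration only, the Python sketch below scores a candidate ensemble by the average mutual information between each member's predictions and the labels (a proxy for individual accuracy), penalised by the average mutual information between pairs of members (a proxy for lack of diversity), and greedily selects an ensemble from a pool. The score J(S), the trade-off weight lam, and the greedy procedure are assumptions for illustration, not the paper's definitions.

    # Sketch of greedy ensemble selection under an accuracy/diversity
    # trade-off. Assumed stand-in score (not the paper's measure):
    #   J(S) = mean_i I(h_i; y) - lam * mean_{i<j} I(h_i; h_j)
    # i.e. relevance of each member's predictions to the labels,
    # penalised by redundancy between pairs of members.
    import itertools
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def tradeoff_score(preds, y, members, lam=0.5):
        """preds: (n_classifiers, n_samples) array of predicted labels."""
        members = list(members)
        relevance = np.mean([mutual_info_score(y, preds[i]) for i in members])
        if len(members) < 2:
            return relevance
        redundancy = np.mean([mutual_info_score(preds[i], preds[j])
                              for i, j in itertools.combinations(members, 2)])
        return relevance - lam * redundancy

    def greedy_select(preds, y, k, lam=0.5):
        """Grow an ensemble of size k from the pool, one member at a time."""
        pool, chosen = set(range(len(preds))), []
        while len(chosen) < k:
            best = max(pool, key=lambda c: tradeoff_score(preds, y, chosen + [c], lam))
            chosen.append(best)
            pool.remove(best)
        return chosen

For reference, the Kernel-Adatron algorithm (Friess, Cristianini, and Campbell, 1998) that the abstract adapts to online learning of multiple SVMs performs additive updates on the dual variables, clipped to the box constraint. Below is a minimal single-SVM, bias-free form; the paper's multi-SVM adaptation is not reproduced here.

    # Plain Kernel-Adatron: gradient-style updates on the dual
    # variables alpha, one example at a time.
    import numpy as np

    def kernel_adatron(K, y, C=1.0, eta=0.1, epochs=100):
        """K: (n, n) kernel matrix; y: numpy array of labels in {-1, +1}."""
        n = len(y)
        alpha = np.zeros(n)
        for _ in range(epochs):
            for i in range(n):
                # Margin of example i under the current dual weights.
                z = y[i] * np.sum(alpha * y * K[:, i])
                # Additive update, clipped to the box constraint [0, C].
                alpha[i] = np.clip(alpha[i] + eta * (1.0 - z), 0.0, C)
        return alpha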
