Abstract

The use of higher-order autocorrelations as features for pattern classification has usually been restricted to second or third order due to the high computational cost. Since the autocorrelation space is high-dimensional, we are interested in reducing the dimensionality of the feature vectors for the benefit of the pattern classification task. An established technique is Principal Component Analysis (PCA), which, however, cannot be applied directly in the autocorrelation space. In this paper, we develop a new method for performing PCA in the autocorrelation space without explicitly computing the autocorrelations. Connections with nonlinear PCA and possible extensions are also discussed.
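The abstract does not spell out the authors' algorithm, but the idea of performing PCA in a high-dimensional feature space without explicitly constructing the feature vectors is the kernel-trick pattern also underlying nonlinear (kernel) PCA. The sketch below is a minimal, generic illustration of that pattern only: the `poly_kernel` function, the polynomial degree, and the toy data are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

# Minimal sketch of PCA via the Gram (kernel) matrix, i.e. kernel PCA.
# The polynomial kernel is an assumed stand-in for an implicit
# higher-order feature map; it is NOT the kernel used in the paper.

def poly_kernel(X, Y, degree=3):
    """Inner products in an implicit monomial feature space (assumed kernel)."""
    return (X @ Y.T) ** degree

def kernel_pca(X, n_components=2, degree=3):
    """PCA in the implicit feature space using only the n x n Gram matrix."""
    n = X.shape[0]
    K = poly_kernel(X, X, degree)
    # Center the Gram matrix, which centers the data in feature space.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the symmetric centered Gram matrix.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # Scale eigenvectors so the feature-space principal axes have unit norm.
    alphas = eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))
    # Low-dimensional projections of the training samples.
    return Kc @ alphas

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 8))   # 50 toy signals of length 8
    Z = kernel_pca(X, n_components=2)
    print(Z.shape)                     # (50, 2): reduced feature vectors
```

The point of the sketch is only that all computations involve the n x n Gram matrix of the training samples, never the (much larger) explicit feature vectors, which is what makes dimensionality reduction in such spaces tractable.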
