Abstract

Generalization networks are nonparametric estimators obtained by applying Tychonov regularization or Bayes estimation to the hypersurface reconstruction problem. Under symmetry assumptions they are a particular type of radial basis function neural network. In the paper it is shown that such networks guarantee consistent identification of a very general (infinite-dimensional) class of NARX models. The proofs are based on the theory of reproducing kernel Hilbert spaces and the notion of frequency of time probability, by means of which it is not necessary to assume that the input is sampled from a stochastic process.
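As an illustration of the setting described above, the following sketch shows how a Tychonov-regularized hypersurface estimate with a Gaussian (radial basis function) kernel reduces to solving a linear system, applied to NARX-style regressors built from past outputs and inputs. This is a minimal, hypothetical example: the kernel width, regularization weight, and the toy system are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, width=1.0):
    """Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_regularization_network(X, y, lam=1e-3, width=0.5):
    """Tychonov-regularized fit: solve (K + lam*I) c = y.

    The resulting estimate is f(x) = sum_i c_i k(x, x_i), i.e. an RBF
    network with one unit centered at each data point.
    """
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, width=0.5):
    return gaussian_kernel(X_new, X_train, width) @ c

# Toy NARX data (hypothetical system): predict y[t] from (y[t-1], u[t-1]).
rng = np.random.default_rng(0)
n = 200
u = rng.uniform(-1.0, 1.0, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * np.sin(y[t - 1]) + u[t - 1] ** 2 + 0.01 * rng.standard_normal()

X = np.column_stack([y[:-1], u[:-1]])   # regressor vectors
target = y[1:]                          # one-step-ahead targets
c = fit_regularization_network(X, target)
yhat = predict(X, c, X)
print("training MSE:", float(np.mean((yhat - target) ** 2)))
```

The choice of a Gaussian kernel corresponds to the symmetry assumptions mentioned above; any other positive-definite kernel from a reproducing kernel Hilbert space would fit the same template.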
