Consistent identification of NARX models via regularization networks

Regularization networks are nonparametric estimators obtained by applying Tychonov regularization or Bayes estimation to the hypersurface reconstruction problem. Under symmetry assumptions they are a particular type of radial basis function neural network. The paper shows that such networks guarantee consistent identification of a very general (infinite-dimensional) class of NARX models. The proofs are based on the theory of reproducing kernel Hilbert spaces and on the notion of frequency of time probability, by means of which it is not necessary to assume that the input is sampled from a stochastic process.
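The construction summarized above can be illustrated with a minimal sketch: a Gaussian radial basis function network whose weights are obtained by Tychonov-regularized least squares (kernel ridge regression), fitted to data from a toy NARX system. The system, kernel width, and regularization parameter below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a simple noisy NARX process y(t) = f(y(t-1), u(t-1)) + e(t).
N = 300
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.5 * np.sin(y[t - 1]) + 0.8 * u[t - 1] + 0.05 * rng.standard_normal()

# NARX regressors x(t) = [y(t-1), u(t-1)] and targets y(t).
X = np.column_stack([y[:-1], u[:-1]])
Y = y[1:]

def gaussian_gram(A, B, width=0.5):
    """Gram matrix of the Gaussian (RBF) kernel between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width**2))

# Tychonov-regularized solution: the coefficients c solve (K + gamma*I) c = Y,
# giving the estimate f_hat(x) = sum_i c_i k(x, x_i) -- a radial basis
# function network with one unit centered at each data point.
gamma = 1e-2
K = gaussian_gram(X, X)
c = np.linalg.solve(K + gamma * np.eye(len(Y)), Y)

# One-step-ahead predictions on the training inputs.
Y_hat = K @ c
rmse = np.sqrt(np.mean((Y - Y_hat) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

The regularization parameter `gamma` trades data fit against smoothness of the reconstructed hypersurface; the consistency result concerns the behavior of such estimates as the number of data points grows.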

Published in:
IEEE Transactions on Automatic Control (Special Section on Neural Networks in Control, Identification, and Decision Making), vol. 44, no. 11, pp. 2045-2049, 1999.

