Polynomial autoregressions have usually been dismissed as unrealistic. Indeed, for such processes to be stationary, strong assumptions on the parameters and on the noise are required; for example, the noise distribution must have finite support. Nevertheless, the use of polynomials has been advocated by a few authors: Cox (1991), Cobb and Zacks (1988), and Chan and Tong (1994). From a model-free perspective, that is, when parametric families of functions (e.g. Fourier series, wavelets, neural networks, MARS, ...) are used as approximators of the optimal predictor, stationarity is no longer a concern. Still, polynomials have not been used within this framework; the reason is the well-known "curse of dimensionality": when high lag orders are used to forecast, a common situation in time series, polynomial autoregressions involve too many parameters to estimate. We introduce a new family of predictors based on polynomials and on a projection scheme. A censoring procedure resolves the instability inherent to polynomials. The forecasting method is parsimonious (non-linearity is introduced with a small number of parameters) and can therefore forecast noisy time series of short to moderate length.
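To make the general idea concrete, the following is a minimal sketch, not the authors' exact scheme: a polynomial autoregression fitted by least squares, with additive powers of the lags only (no cross terms, to keep the parameter count small) and with predictions censored by clipping to a fixed band. The censoring threshold (three sample standard deviations here) and the choice of clipping as the censoring mechanism are illustrative assumptions.

```python
import numpy as np

def make_design(x, p, degree):
    """Design matrix built from powers 1..degree of the last p lags.
    Cross terms are omitted for parsimony (an assumption of this sketch)."""
    n = len(x) - p
    # Column k holds lag k+1: the value k+1 steps before each target.
    lags = np.column_stack([x[p - 1 - k : len(x) - 1 - k] for k in range(p)])
    cols = [np.ones(n)] + [lags ** d for d in range(1, degree + 1)]
    return np.column_stack(cols), x[p:]

def fit_censored_poly_ar(x, p=2, degree=2, bound=None):
    """Least-squares fit of the polynomial AR coefficients.
    `bound` is the censoring threshold; by default 3 sample std devs."""
    X, y = make_design(x, p, degree)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    if bound is None:
        bound = 3.0 * np.std(x)
    return beta, bound

def predict_next(x, beta, p, degree, bound):
    """One-step-ahead forecast, censored to [-bound, bound] so that
    the polynomial cannot explode outside the observed range."""
    lags = x[-1 : -p - 1 : -1]  # most recent p values, newest first
    feats = np.concatenate([[1.0]] + [lags ** d for d in range(1, degree + 1)])
    return float(np.clip(feats @ beta, -bound, bound))
```

Clipping is the simplest form of censoring; it guarantees bounded forecasts at the price of a flat response outside the band, which is acceptable when the series itself is bounded in practice.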