Abstract

We propose a non-parametric regression methodology that constrains the regressor to be fully consistent with the sample set and with the assumed regularity of the ground truth. In contrast to the Nonlinear Set Membership technique, this constraint guarantees everywhere-differentiable surrogate models, which are better suited to optimization-based controllers that rely heavily on gradient computations. The presented approach, named Smooth Lipschitz Regression (SLR), also provides bounds on the prediction error at unseen points in the space. A numerical example demonstrates the effectiveness of the method compared to the alternatives in a Model Predictive Control setting.
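The abstract does not detail the SLR construction itself, but for context, a minimal sketch of the baseline it contrasts against, Nonlinear Set Membership (Lipschitz) interpolation, could look as follows. The sample data X, y and the Lipschitz constant L are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical sample set (x_i, y_i) and an assumed Lipschitz bound L
X = np.array([[0.0], [0.5], [1.0]])
y = np.array([0.0, 0.4, 0.9])
L = 1.2  # assumed Lipschitz constant of the ground truth

def nsm_bounds(x, X, y, L):
    """Nonlinear Set Membership upper/lower envelopes at a query point x."""
    d = np.linalg.norm(X - x, axis=1)  # distances from x to each sample
    upper = np.min(y + L * d)          # tightest Lipschitz upper bound
    lower = np.max(y - L * d)          # tightest Lipschitz lower bound
    return lower, upper

def nsm_predict(x, X, y, L):
    """Central estimate; its error is at most (upper - lower) / 2."""
    lo, hi = nsm_bounds(x, X, y, L)
    return 0.5 * (lo + hi)

x_query = np.array([0.25])
lo, hi = nsm_bounds(x_query, X, y, L)
print(nsm_predict(x_query, X, y, L), (hi - lo) / 2)  # estimate, error bound
```

Note that this central estimator is only piecewise differentiable (its envelopes have kinks at points where the active sample changes), which is precisely the limitation for gradient-based controllers that motivates the smooth surrogate models of SLR.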
