Although it is known that accurate Lipschitz estimates are essential for certain models to deliver good predictive performance, estimating this constant tightly in practice can be a difficult task, especially when the input dimension is high. In this letter, we shed light on the consequences of employing loose Lipschitz bounds in the Nonlinear Set Membership (NSM) framework, showing that as the bound grows the model converges to a nearest neighbor regressor (k-NN with k = 1). This convergence is moreover not uniform, and it is monotonic in the univariate case. An intuitive geometrical interpretation of the result is then given, and its practical implications are discussed.
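The limiting behavior described above can be illustrated numerically. The sketch below assumes the standard NSM central estimator (the midpoint of the tightest upper and lower Lipschitz-consistent bounds built from the data); the dataset and query point are illustrative, not from the letter. With a small Lipschitz bound L the NSM prediction differs from the 1-NN prediction, while for a large (loose) L the two coincide:

```python
import numpy as np

def nsm_central(x_query, X, y, L):
    """Central NSM estimate: midpoint of the tightest upper and lower
    bounds consistent with the data (X, y) and Lipschitz constant L."""
    d = np.linalg.norm(X - x_query, axis=1)  # distances to the samples
    upper = np.min(y + L * d)                # tightest upper bound at x_query
    lower = np.max(y - L * d)                # tightest lower bound at x_query
    return 0.5 * (upper + lower)

def nn1(x_query, X, y):
    """Nearest neighbor (k-NN with k = 1) prediction."""
    return y[np.argmin(np.linalg.norm(X - x_query, axis=1))]

# Toy univariate dataset (illustrative values only).
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 0.5])
xq = np.array([0.3])  # query point; its nearest neighbor is x = 0.0

for L in (1.0, 100.0):
    print(f"L={L}: NSM={nsm_central(xq, X, y, L):.3f}, "
          f"1-NN={nn1(xq, X, y):.3f}")
```

For L = 1 the central estimate (0.3) still blends information from several samples, whereas for L = 100 the nearest sample dominates both bounds and the NSM estimate collapses onto the 1-NN value, consistent with the result stated in the letter.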