Infoscience

Journal article

Discrete least-squares approximations over optimized downward closed polynomial spaces in arbitrary dimension

We analyze the accuracy of the discrete least-squares approximation of a function $u$ in multivariate polynomial spaces $P_\Lambda:=\mathrm{span}\{y\mapsto y^\nu \,: \, \nu\in \Lambda\}$ with $\Lambda\subset \mathbb{N}_0^d$ over the domain $\Gamma:=[-1,1]^d$, based on the sampling of this function at points $y^1,\dots,y^m \in \Gamma$. The samples are independently drawn according to a given probability density $\rho$ belonging to the class of multivariate beta densities, which includes the uniform density as a particular case. Motivated by recent results in high-dimensional parametric and stochastic PDEs, we restrict our attention to polynomial spaces associated with downward closed sets $\Lambda$ of prescribed cardinality $n$, and we optimize the choice of the space for the given sample. This implies in particular that the selected polynomial space depends on the sample. We are interested in comparing the error of this least-squares approximation, measured in $L^2(\Gamma,\rho)$, with the best achievable polynomial approximation error when using downward closed sets of cardinality $n$. We establish conditions between the dimension $n$ and the size $m$ of the sample under which these two errors are proven to be comparable. We show that the dimension $d$ enters only moderately in the resulting trade-off between $m$ and $n$, in terms of a logarithmic factor $\ln(d)$, and is even absent when the optimization is restricted to a relevant subclass of downward closed sets. In principle, this allows one to use these methods in high dimension. Our analysis builds upon \cite{CCMNT2013}, which considered fixed and non-optimized downward closed multi-index sets. Potential applications of the proposed results are found in the development and analysis of efficient numerical methods for computing the solution of high-dimensional parametric or stochastic PDEs, but are not limited to this area.
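The basic object of study can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (not the paper's method for optimizing $\Lambda$): it fits a function by discrete least squares on a fixed downward closed set $\Lambda$ of cardinality $n=5$ in dimension $d=2$, with $m$ samples drawn from the uniform density on $[-1,1]^2$ (a special case of the beta class). The function names and the particular set $\Lambda$ are illustrative assumptions.

```python
import numpy as np

def design_matrix(points, Lambda):
    """Evaluate the monomial basis y -> y^nu, nu in Lambda, at the sample points."""
    # points: (m, d) array; Lambda: list of d-dimensional multi-indices
    return np.stack(
        [np.prod(points ** np.asarray(nu), axis=1) for nu in Lambda], axis=1
    )

def least_squares_fit(u, points, Lambda):
    """Coefficients of the discrete least-squares fit of u over P_Lambda."""
    A = design_matrix(points, Lambda)
    coeffs, *_ = np.linalg.lstsq(A, u(points), rcond=None)
    return coeffs

# A downward closed set Lambda in d = 2 (every multi-index below a member is a member).
Lambda = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1)]

rng = np.random.default_rng(0)
m = 50  # sample size; the analysis relates m to n = |Lambda|
points = rng.uniform(-1.0, 1.0, size=(m, 2))

# A target that already lies in P_Lambda, so the fit recovers it up to round-off.
u = lambda y: 1.0 + 2.0 * y[:, 0] * y[:, 1] - 0.5 * y[:, 0] ** 2
coeffs = least_squares_fit(u, points, Lambda)
```

For a target outside $P_\Lambda$, the quantity studied in the paper is the gap between the $L^2(\Gamma,\rho)$ error of this fit and the best approximation error over downward closed sets of the same cardinality.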
