Title: Certified and Fast Computations with Shallow Covariance Kernels

Authors: Kressner, Daniel; Latz, Jonas; Massei, Stefano; Ullmann, Elisabeth
Publication date: 2020-12-01
Record date: 2021-07-17
DOI: 10.3934/fods.2020022
Repository: https://infoscience.epfl.ch/handle/20.500.14299/180058
Web of Science: WOS:000663367400005
Document type: text::journal::journal article::research article

Abstract: Many techniques in data science and uncertainty quantification demand efficient tools to handle Gaussian random fields, which are defined in terms of their mean functions and covariance operators. Recently, parameterized Gaussian random fields have gained increased attention due to their higher degree of flexibility. However, especially if the random field is parameterized through its covariance operator, classical random field discretization techniques fail or become inefficient. In this work we introduce and analyze a new, certified algorithm for the low-rank approximation of a parameterized family of covariance operators, which extends the adaptive cross approximation method for symmetric positive definite matrices. The algorithm relies on an affine linear expansion of the covariance operator with respect to the parameters, which needs to be computed in a preprocessing step using, e.g., the empirical interpolation method. We discuss and test our new approach for isotropic covariance kernels, such as Matérn kernels. The numerical results demonstrate the advantages of our approach in terms of computational time and confirm that the proposed algorithm provides the basis of a fast sampling procedure for parameter-dependent Gaussian random fields.

Subjects: Mathematics, Applied; Statistics & Probability; Mathematics
Keywords: adaptive cross approximation; covariance matrix; greedy algorithm; Wasserstein distance; Gaussian random field; empirical interpolation; hierarchical matrices; Gaussian processes; random fields; approximation; algorithms
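For context, the building block the abstract refers to is the adaptive cross approximation (ACA) of a symmetric positive definite covariance matrix, which for SPD matrices coincides with diagonally pivoted Cholesky and comes with a computable trace error bound. The sketch below is only an illustration of that classical, non-parameterized ingredient applied to an isotropic Matérn-3/2 kernel; it is not the paper's certified algorithm for parameterized covariance operators, and the point set, length scale, and tolerance are made-up example values.

```python
# Minimal sketch: ACA / pivoted Cholesky for an SPD covariance matrix built
# from an isotropic Matern-3/2 kernel. Illustrative only; not the paper's
# parameterized, certified algorithm.
import numpy as np


def matern32(x, y, ell=0.25):
    """Matern-3/2 covariance between point sets x (n, d) and y (m, d)."""
    r = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    s = np.sqrt(3.0) * r / ell
    return (1.0 + s) * np.exp(-s)


def aca_spd(points, kernel, tol=1e-8, max_rank=None):
    """Low-rank factor L with K ~= L @ L.T, where K[i, j] = kernel(x_i, x_j).

    Pivots on the largest diagonal entry of the current Schur complement.
    For SPD kernels the remaining diagonal sums to trace(K - L @ L.T),
    which serves as a computable (certified) error bound.
    """
    n = points.shape[0]
    max_rank = n if max_rank is None else min(max_rank, n)
    # Diagonal of K (for an isotropic kernel this is kernel(x, x) = 1).
    d = np.array([kernel(p[None, :], p[None, :])[0, 0] for p in points])
    L = np.zeros((n, max_rank))
    for k in range(max_rank):
        if d.sum() <= tol:            # trace error bound met: stop early
            return L[:, :k], d.sum()
        i = int(np.argmax(d))         # pivot: largest remaining diagonal entry
        col = kernel(points, points[i:i + 1])[:, 0]
        col -= L[:, :k] @ L[i, :k]    # remove the part already captured
        L[:, k] = col / np.sqrt(d[i])
        d -= L[:, k] ** 2             # update Schur complement diagonal
        d[i] = 0.0                    # guard against rounding below zero
    return L, d.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 2))    # 500 random points in the unit square
    L, trace_err = aca_spd(X, matern32, tol=1e-10)
    K = matern32(X, X)
    print("rank:", L.shape[1],
          "trace error bound:", trace_err,
          "max entrywise error:", np.abs(K - L @ L.T).max())
```

The factor L obtained this way can be used to draw approximate samples of the corresponding Gaussian random field via L @ z with z standard normal, which is the kind of fast sampling the abstract alludes to; how the paper extends this to a whole parameterized family of covariance operators via an affine expansion and empirical interpolation is described in the article itself.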