Abstract

We propose two different improvements of reduced basis (RB) methods to enable the efficient and accurate evaluation of an output functional based on the numerical solution of parametrized partial differential equations with a possibly high-dimensional parameter space. The common element of the two techniques is that both rely on ANOVA expansions to achieve the improvements. The first method is a three-step RB-ANOVA-RB method, which combines reduced basis methods and ANOVA expansions to effectively compress the parameter space without impacting the accuracy of the output of interest. This is achieved by first building a low-accuracy reduced model for the full high-dimensional parametric problem. This model is used to recover an approximate ANOVA expansion of the output functional at marginal cost, allowing the estimation of the sensitivity of the output functional to parameter variations and enabling a subsequent compression of the parameter space. A new, accurate reduced model can then be constructed for the compressed parametric problem at a substantially lower computational cost than for the full problem. In the second approach, we use the ANOVA expansion to drive an hp reduced basis method. This is initiated by fixing the maximum number of reduced basis functions that can be afforded during the online stage. If the offline greedy procedure for a given parameter domain converges within this budget, the offline algorithm stops. Otherwise, an approximate ANOVA expansion is computed for the output functional, and the parameter domain is decomposed into several subdomains by splitting along the most important parameters identified by the ANOVA expansion. The offline greedy algorithm is then performed in each parameter subdomain. The procedure is applied recursively until the offline greedy algorithm converges across all parameter subdomains. We demonstrate the accuracy, efficiency, and generality of these two approaches through a number of test cases.
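To make the shared ANOVA ingredient concrete, the following Python sketch illustrates one way the middle step of the RB-ANOVA-RB approach could be realized: first-order ANOVA (Sobol') sensitivity indices of the output functional are estimated from a cheap surrogate standing in for the low-accuracy reduced model, and parameters with negligible indices are frozen at an anchor value. The estimator is a standard pick-and-freeze Monte Carlo form, and all names here (first_order_sobol, compress_parameter_space, coarse_rb_output) are hypothetical illustrations, not the paper's implementation.

import numpy as np

def first_order_sobol(f, dim, n_samples=4096, seed=0):
    """Estimate first-order ANOVA (Sobol') indices of f on [0, 1]^dim with a
    basic pick-and-freeze Monte Carlo estimator."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, dim))
    B = rng.random((n_samples, dim))
    fA = np.array([f(mu) for mu in A])
    fB = np.array([f(mu) for mu in B])
    total_var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # resample only the i-th parameter
        fABi = np.array([f(mu) for mu in ABi])
        S[i] = np.mean(fB * (fABi - fA)) / total_var   # first-order index estimate
    return S

def compress_parameter_space(S, mu_anchor, tol=1e-2):
    """Keep parameters whose sensitivity exceeds tol; freeze the others at the
    anchor value so the accurate reduced model only sees the active ones."""
    active = np.flatnonzero(S >= tol)
    frozen = {i: float(mu_anchor[i]) for i in range(len(S)) if i not in set(active)}
    return active, frozen

if __name__ == "__main__":
    # Toy surrogate standing in for the low-accuracy reduced model's output:
    # only the first two of six parameters matter.
    coarse_rb_output = lambda mu: np.sin(np.pi * mu[0]) + 0.5 * mu[1] ** 2 + 1e-3 * mu[2]
    S = first_order_sobol(coarse_rb_output, dim=6)
    active, frozen = compress_parameter_space(S, mu_anchor=0.5 * np.ones(6))
    print("sensitivities:", np.round(S, 3))
    print("active parameters:", active, "frozen:", frozen)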
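For the second approach, a minimal sketch of the recursive, ANOVA-driven partitioning of the parameter domain might look as follows, splitting along the single most sensitive parameter for simplicity. The callables greedy_fn and anova_fn are assumed to be supplied by the user (e.g. the offline greedy sampler and a sensitivity estimator such as the one above); hp_rb_offline and its signature are hypothetical, not the paper's algorithm.

import numpy as np

def hp_rb_offline(domain, greedy_fn, anova_fn, n_max, depth=0, max_depth=8):
    """Sketch of an ANOVA-driven hp reduced basis offline stage.

    domain    : list of (lo, hi) intervals, one per parameter
    greedy_fn : callable(domain, n_max) -> list of basis functions, or None
                if the greedy tolerance is not met within n_max bases
    anova_fn  : callable(domain) -> first-order sensitivity indices
    Returns a list of (subdomain, basis) pairs covering the parameter domain.
    """
    basis = greedy_fn(domain, n_max)
    if basis is not None or depth >= max_depth:
        return [(domain, basis)]
    # Greedy did not converge within the budget: split the subdomain along
    # the parameter the ANOVA expansion flags as most important.
    S = np.asarray(anova_fn(domain))
    i = int(np.argmax(S))
    lo, hi = domain[i]
    mid = 0.5 * (lo + hi)
    left = list(domain); left[i] = (lo, mid)
    right = list(domain); right[i] = (mid, hi)
    return (hp_rb_offline(left, greedy_fn, anova_fn, n_max, depth + 1, max_depth)
            + hp_rb_offline(right, greedy_fn, anova_fn, n_max, depth + 1, max_depth))

Each returned pair associates a parameter subdomain with its converged reduced basis, which is the structure an online stage would query for a given parameter value.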
