Journal article in SIAM Journal on Scientific Computing, 2020

Gradient-based dimension reduction of multivariate vector-valued functions

Abstract

Multivariate functions encountered in high-dimensional uncertainty quantification problems often vary along a few dominant directions in the input parameter space. We propose a gradient-based method for detecting these directions and using them to construct ridge approximations of such functions, in a setting where the functions are vector-valued (e.g., taking values in R^n). The methodology consists of minimizing an upper bound on the approximation error, obtained by subspace Poincaré inequalities. We provide a thorough mathematical analysis in the case where the parameter space is equipped with a Gaussian probability measure. The resulting method generalizes the notion of active subspaces associated with scalar-valued functions. A numerical illustration shows that using gradients of the function yields effective dimension reduction. We also show how the choice of norm on the codomain of the function affects the function's low-dimensional approximation.
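To make the high-level idea concrete, here is a minimal Python sketch of gradient-based dimension reduction for a vector-valued function with standard Gaussian inputs. It is not the paper's exact construction: the toy function, sample sizes, Monte Carlo estimators, and the crude conditional-expectation surrogate are all illustrative assumptions. The sketch estimates a diagnostic matrix from Jacobians weighted by a chosen output norm, takes its leading eigenvectors as the dominant directions, and builds a ridge approximation on the resulting subspace.

```python
import numpy as np

# Minimal sketch (illustrative assumptions throughout): gradient-based
# dimension reduction for f: R^d -> R^n with standard Gaussian inputs.

rng = np.random.default_rng(0)
d, n = 20, 3                          # input / output dimensions (toy choice)
A = rng.standard_normal((n, d)) / np.sqrt(d)

def f(x):
    """Toy vector-valued function varying mostly along a few directions."""
    return np.tanh(A @ x)

def jac_f(x):
    """Jacobian of f at x, shape (n, d)."""
    return (1.0 - np.tanh(A @ x) ** 2)[:, None] * A

# Diagnostic matrix H = E[ J(X)^T R_n J(X) ], estimated by Monte Carlo,
# where R_n is the (SPD) matrix defining the norm on the codomain R^n.
R_n = np.eye(n)                       # the choice of output norm matters
N = 2000
H = np.zeros((d, d))
for _ in range(N):
    x = rng.standard_normal(d)
    J = jac_f(x)
    H += J.T @ R_n @ J
H /= N

# Dominant directions: leading eigenvectors of H.
eigval, eigvec = np.linalg.eigh(H)
order = np.argsort(eigval)[::-1]
r = 2                                 # reduced dimension (illustrative)
U_r = eigvec[:, order[:r]]            # columns span the detected subspace

# Ridge approximation f(x) ~ g(U_r^T x): here g is a crude Monte Carlo
# approximation of the conditional expectation E[f(X) | U_r^T X].
def ridge(x, n_cond=200):
    y = U_r @ (U_r.T @ x)             # component in the detected subspace
    z = rng.standard_normal((n_cond, d))
    z_perp = z - (z @ U_r) @ U_r.T    # resample the complementary component
    return np.mean([f(y + zp) for zp in z_perp], axis=0)

x_test = rng.standard_normal(d)
print("leading eigenvalues:", np.round(eigval[order[:5]], 3))
print("pointwise ridge error:", np.linalg.norm(f(x_test) - ridge(x_test)))
```

Changing `R_n` to a different symmetric positive definite matrix reweights the output components and, as the abstract notes, generally changes both the detected subspace and the quality of the resulting low-dimensional approximation.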
Main file: main.pdf (2.82 MB). Origin: files produced by the author(s).

Dates and versions

hal-01701425 , version 1 (05-02-2018)
hal-01701425 , version 2 (14-12-2018)
hal-01701425 , version 3 (08-11-2019)

Identifiers

HAL Id: hal-01701425
DOI: 10.1137/18M1221837
Cite

Olivier Zahm, Paul Constantine, Clémentine Prieur, Youssef Marzouk. Gradient-based dimension reduction of multivariate vector-valued functions. SIAM Journal on Scientific Computing, 2020, 42 (1), pp.A929-A956. ⟨10.1137/18M1221837⟩. ⟨hal-01701425v3⟩
