Robust high dimensional learning for Lipschitz and convex losses - INRIA - Institut National de Recherche en Informatique et en Automatique
Preprint, Working Paper. Year: 2019

Robust high dimensional learning for Lipschitz and convex losses

Abstract

We establish risk bounds for Regularized Empirical Risk Minimizers (RERM) when the loss is Lipschitz and convex and the regularization function is a norm. We obtain these results in the i.i.d. setup under subgaussian assumptions on the design. In a second part, we consider a more general framework where the design may have heavier tails and the data may be corrupted by outliers, both in the design and in the response variables. In this situation, RERM performs poorly in general, so we analyse an alternative procedure based on the median-of-means principle, called "minmax MOM". We show optimal subgaussian deviation rates for these estimators in this relaxed setting. The main results are meta-theorems allowing a wide range of applications to various problems in learning theory. As a non-exhaustive sample of these potential applications, they are applied to classification problems with the logistic loss regularized by LASSO and SLOPE, to regression problems with the Huber loss regularized by Group LASSO, Total Variation and Fused LASSO, and to matrix completion problems with the quantile loss regularized by the nuclear norm. A short simulation study concludes the paper, illustrating in particular the robustness properties of regularized minmax MOM procedures.
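The abstract invokes the median-of-means principle underlying the minmax MOM estimators. A minimal sketch of that principle, for the simplest case of estimating a univariate mean, is given below; the function name and block count are illustrative choices, and this is not the paper's minmax MOM procedure, only the basic building block it is built on: split the sample into blocks, average within each block, and take the median of the block means, so that a few gross outliers can spoil only a minority of blocks.

```python
import random
import statistics


def median_of_means(values, n_blocks):
    """Median-of-means estimate of the mean of `values`.

    Splits the data into `n_blocks` equal-sized blocks, computes the
    empirical mean on each block, and returns the median of the block
    means. Outliers corrupt at most as many block means as there are
    outliers, so the median stays close to the true mean as long as a
    majority of blocks is clean.
    """
    block_size = len(values) // n_blocks
    block_means = [
        statistics.fmean(values[i * block_size:(i + 1) * block_size])
        for i in range(n_blocks)
    ]
    return statistics.median(block_means)


# Clean Gaussian sample centred at 1.0, plus a few gross outliers.
random.seed(0)
data = [random.gauss(1.0, 0.5) for _ in range(300)] + [1e6] * 3
random.shuffle(data)

# The plain empirical mean is ruined by the three outliers,
# while the median-of-means estimate stays near 1.0.
print(statistics.fmean(data))
print(median_of_means(data, n_blocks=9))
```

With 9 blocks, the 3 outliers can contaminate at most 3 block means, leaving the median determined by clean blocks; this deviation-robustness is what the paper's minmax MOM construction extends to regularized learning with Lipschitz, convex losses.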

Dates and versions

hal-02159943, version 1 (19-06-2019)

Identifiers

Cite

Geoffrey Chinot, Guillaume Lecué, Matthieu Lerasle. Robust high dimensional learning for Lipschitz and convex losses. 2019. ⟨hal-02159943⟩
