Journal article in Mathematical Programming, 2023

Newton acceleration on manifolds identified by proximal-gradient methods

Gilles Bareilles
Franck Iutzeler
Jérôme Malick

Abstract

Proximal methods are known to identify the underlying substructure of nonsmooth optimization problems. Moreover, in many interesting situations, the output of a proximity operator comes with its structure at no additional cost, and convergence improves once this structure matches that of a minimizer. However, it is impossible in general to know whether the current structure is final; this highly valuable information has to be exploited adaptively. To do so, we place ourselves in the case where a proximal-gradient method can identify manifolds of differentiability of the nonsmooth objective. Leveraging this manifold identification, we show that Riemannian Newton-like steps can be intertwined with the proximal-gradient iterations to drastically boost convergence. We prove superlinear convergence of the algorithm when solving some nondegenerate nonsmooth nonconvex optimization problems. We provide numerical illustrations on optimization problems regularized by the ℓ1-norm or the trace norm.
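To illustrate the identification property the abstract refers to, here is a minimal sketch (not the authors' code) of a proximal-gradient iteration on an ℓ1-regularized problem: the prox of the ℓ1-norm is soft-thresholding, so each iterate's support is available at no extra cost and reveals the current sparsity manifold. All function names below are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_step(x, grad_f, step, lam):
    """One proximal-gradient step for min f(x) + lam * ||x||_1.

    Returns the new point together with its support: the indices of the
    nonzero coordinates, which determine the identified sparsity manifold
    {y : y_i = 0 for all i outside the support}.
    """
    x_new = soft_threshold(x - step * grad_f(x), step * lam)
    support = np.flatnonzero(x_new)
    return x_new, support

# Toy smooth term f(x) = 0.5 * ||x - b||^2, so grad f(x) = x - b.
b = np.array([1.0, -0.2, 0.05])
grad_f = lambda x: x - b

x = np.zeros(3)
for _ in range(50):
    x, support = prox_grad_step(x, grad_f, step=1.0, lam=0.3)

# With lam = 0.3, only the coordinate with |b_i| > 0.3 survives:
# the iterates settle on the manifold {y : y_1 = y_2 = 0}.
print(x, support)  # → [0.7 0.  0. ] [0]
```

Once the support stabilizes, a Newton-like method restricted to that manifold can take over, which is the acceleration mechanism the paper develops (in the Riemannian setting, covering the trace norm as well).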
Main file: paper.pdf (2.52 MB). Origin: files produced by the author(s).

Dates and versions

hal-03197686 , version 1 (30-09-2021)
hal-03197686 , version 2 (16-12-2021)
hal-03197686 , version 3 (25-05-2022)

Cite

Gilles Bareilles, Franck Iutzeler, Jérôme Malick. Newton acceleration on manifolds identified by proximal-gradient methods. Mathematical Programming, 2023, 200, pp.37-70. ⟨10.1007/s10107-022-01873-w⟩. ⟨hal-03197686v3⟩