Journal article — Open Journal of Mathematical Optimization, 2022

A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives

Abstract

In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov's accelerated proximal gradient method) that may rely on approximate proximal operators and can exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments. Using the same developments, we further provide a version of the accelerated proximal hybrid extragradient method of Monteiro and Svaiter (2013) that can exploit strong convexity of the objective function.
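As an illustration of the kind of method the note studies, here is a minimal sketch of a (non-approximate) accelerated forward-backward iteration for a lasso-type objective, in the FISTA style. This is not the paper's algorithm: the function names, step-size choice, and the use of an exact soft-thresholding prox (rather than the approximate prox evaluations the paper allows) are assumptions made for this sketch.

```python
import numpy as np

def soft_threshold(v, tau):
    # Exact proximal operator of tau * ||.||_1 (soft-thresholding).
    # The paper's method also tolerates approximate prox evaluations
    # with absolute or relative errors; here the prox is exact.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def accelerated_fb(A, b, lam, n_iters=200):
    """FISTA-style accelerated forward-backward sketch for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)                       # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)  # backward (prox) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2       # Nesterov momentum update
        y = x_new + ((t - 1) / t_new) * (x_new - x)    # extrapolation
        x, t = x_new, t_new
    return x
```

A strongly convex variant, as in the note, would instead use a constant momentum coefficient built from the condition number; the time-varying `t`-sequence above covers only the merely convex case.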

Dates and versions

hal-03377374, version 1 (14-10-2021)

Identifiers

Cite

Mathieu Barré, Adrien Taylor, Francis Bach. A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives. Open Journal of Mathematical Optimization, 2022, ⟨10.5802/ojmo.12⟩. ⟨hal-03377374⟩