Conference paper, Year: 2020

Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling

Abstract

Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning. The basic premise of these algorithms is the use of an extrapolation step before performing an update; thanks to this exploration step, extragradient methods overcome many of the non-convergence issues that plague gradient descent/ascent schemes. On the other hand, as we show in this paper, running vanilla extragradient with stochastic gradients may jeopardize its convergence, even in simple bilinear models. To overcome this failure, we investigate a double stepsize extragradient algorithm where the exploration step evolves at a more aggressive timescale compared to the update step. We show that this modification allows the method to converge even with stochastic gradients, and we derive sharp convergence rates under an error bound condition.
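For illustration, the two-timescale scheme described in the abstract can be sketched as follows: an exploration (extrapolation) step taken with a large stepsize, followed by an update of the base iterate with a much smaller stepsize. This is a minimal NumPy sketch on a toy bilinear problem min_x max_y x^T A y with additive gradient noise; the stepsize exponents, noise model, and variable names are illustrative assumptions, not the exact schedules analyzed in the paper.

import numpy as np

# Minimal sketch of the double stepsize stochastic extragradient idea:
# aggressive exploration step, conservative update step.
# Toy problem, noise model, and stepsize schedules are illustrative assumptions.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def noisy_field(x, y, sigma=0.1):
    # Stochastic gradient field of min_x max_y x^T A y (descent in x, ascent in y).
    gx = A @ y + sigma * rng.standard_normal(d)
    gy = -A.T @ x + sigma * rng.standard_normal(d)
    return gx, gy

x, y = np.ones(d), np.ones(d)
for t in range(1, 20001):
    gamma = t ** (-1 / 3)   # exploration stepsize: large, decays slowly (assumed schedule)
    eta = t ** (-2 / 3)     # update stepsize: small, so that eta / gamma -> 0 (assumed schedule)

    # Exploration: extrapolate with a fresh stochastic sample and the large stepsize.
    gx, gy = noisy_field(x, y)
    x_lead, y_lead = x - gamma * gx, y - gamma * gy

    # Update: move the base iterate using a sample taken at the extrapolated point,
    # but with the much smaller stepsize.
    gx, gy = noisy_field(x_lead, y_lead)
    x, y = x - eta * gx, y - eta * gy

print("distance of (x, y) to the saddle point at the origin:",
      np.linalg.norm(np.concatenate([x, y])))

With a single stepsize (eta equal to gamma), this is the vanilla stochastic extragradient scheme whose non-convergence on bilinear models motivates the paper; the vanishing ratio eta / gamma is what the double stepsize variant adds.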
Main file
dseg_supp.pdf (8.28 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03002844, version 1 (13-11-2020)

Identifiers

  • HAL Id: hal-03002844, version 1

Cite

Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos. Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling. NeurIPS '20 - 34th International Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. pp.16223--16234. ⟨hal-03002844⟩