Journal article in Mathematical Programming, 2020

Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning

Abstract

Modern problems in AI and numerical analysis require nonsmooth approaches with a flexible calculus. We introduce generalized derivatives called conservative fields, for which we develop a calculus and provide representation formulas. Functions having a conservative field are called path differentiable: convex, concave, Clarke regular, and any semialgebraic Lipschitz continuous functions are path differentiable. Using Whitney stratification techniques for semialgebraic and definable sets, our model provides variational formulas for nonsmooth automatic differentiation oracles, such as the backpropagation algorithm used in deep learning. Our differential model is applied to establish the convergence in values of nonsmooth stochastic gradient methods as they are implemented in practice.
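As a hypothetical illustration (not code from the paper), the sketch below runs a stochastic subgradient method on the nonsmooth, path-differentiable loss f(x) = (1/n) Σ_i |x − a_i|, whose minimizer is the median of the data. At a kink of |·|, an automatic-differentiation oracle returns one fixed element of a conservative field; here that selection is sign(u) with sign(0) = 0, mimicking what backpropagation typically returns for the absolute value. The data and step-size schedule are illustrative choices, not taken from the paper.

```python
import random

def subgrad_abs(u):
    """A selection of a conservative field of u -> |u| (returns 0 at the kink)."""
    return (u > 0) - (u < 0)  # sign(u), with sign(0) = 0

def sgd(data, x0=5.0, steps=5000):
    """Stochastic subgradient descent with diminishing steps 1/k."""
    random.seed(0)
    x = x0
    for k in range(1, steps + 1):
        a = random.choice(data)   # sample one data point
        g = subgrad_abs(x - a)    # stochastic conservative gradient of |x - a|
        x -= g / k                # diminishing step sizes
    return x

data = [1.0, 2.0, 3.0]            # hypothetical data; the median 2.0 minimizes f
x_final = sgd(data)
print(x_final)                    # fluctuates near the minimizer 2.0
```

The iterates oscillate around the minimizer with shrinking amplitude, which is consistent with the paper's convergence-in-values result for such methods; this toy run does not, of course, prove anything by itself.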

Dates and versions

hal-02521848 , version 1 (27-03-2020)

Identifiers

Cite

Jérôme Bolte, Edouard Pauwels. Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning. Mathematical Programming, 2020, 188, pp. 19-51. ⟨10.1007/s10107-020-01501-5⟩. ⟨hal-02521848⟩