Equi-normalization of Neural Networks - INRIA - Institut National de Recherche en Informatique et en Automatique
Conference paper, Year: 2019

Equi-normalization of Neural Networks

Abstract

Modern neural networks are over-parametrized. In particular, each rectified linear hidden unit can be modified by a multiplicative factor by adjusting input and output weights, without changing the rest of the network. Inspired by the Sinkhorn-Knopp algorithm, we introduce a fast iterative method for minimizing the L2 norm of the weights, equivalently the weight decay regularizer. It provably converges to a unique solution. Interleaving our algorithm with SGD during training improves the test accuracy. For small batches, our approach offers an alternative to batch- and group-normalization on CIFAR-10 and ImageNet with a ResNet-18.
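The rescaling invariance the abstract relies on can be sketched in a few lines of NumPy. This is an illustrative toy for a single hidden ReLU layer, not the authors' implementation: scaling hidden unit i's input weights by c_i > 0 and its output weights by 1/c_i leaves the function unchanged (since relu(c*z) = c*relu(z) for c > 0), and choosing c_i to balance the input-row and output-column norms is the closed-form minimizer of the per-unit L2 cost.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 3))   # input weights of 5 hidden ReLU units
W2 = rng.standard_normal((2, 5))   # output weights
x = rng.standard_normal(3)

relu = lambda z: np.maximum(z, 0.0)
y = W2 @ relu(W1 @ x)

# Balance each unit: minimizing (c*r)^2 + (s/c)^2 over c > 0 gives
# c = sqrt(s / r), where r, s are the input-row and output-column norms.
row = np.linalg.norm(W1, axis=1)   # shape (5,)
col = np.linalg.norm(W2, axis=0)   # shape (5,)
c = np.sqrt(col / row)

W1b = W1 * c[:, None]              # scale rows of W1 by c_i
W2b = W2 / c[None, :]              # scale columns of W2 by 1/c_i
yb = W2b @ relu(W1b @ x)

assert np.allclose(y, yb)          # the network function is unchanged
# By AM-GM, the total squared L2 norm never increases after balancing:
before = np.sum(W1**2) + np.sum(W2**2)
after = np.sum(W1b**2) + np.sum(W2b**2)
assert after <= before + 1e-12
```

For deeper networks the optimal scales of adjacent layers interact, which is why the paper iterates such balancing steps, in a Sinkhorn-Knopp-like fashion, until convergence.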
Main file: paper.pdf (1.15 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02050408 , version 1 (27-02-2019)

Identifiers

Cite

Pierre Stock, Benjamin Graham, Rémi Gribonval, Hervé Jégou. Equi-normalization of Neural Networks. ICLR 2019 - Seventh International Conference on Learning Representations, May 2019, New Orleans, United States. pp.1-20. ⟨hal-02050408⟩