# Toward Fast Transform Learning

IRIT - Institut de recherche en informatique de Toulouse
Abstract: Dictionary learning is a matrix factorization problem. It aims at finding a dictionary of atoms that best represents an image according to a given objective. A common objective is to represent an image, or a class of images, sparsely, which has led to many state-of-the-art algorithms in image processing. In practice, all dictionary learning algorithms iteratively estimate the dictionary and a sparse representation of the images using this dictionary. However, the numerical complexity of dictionary learning restricts its use to atoms with small supports, since computations with the learned dictionaries require too many resources to be deployed in large-scale applications. To alleviate these issues, this paper introduces a new strategy to learn dictionaries composed of atoms obtained by translating the composition of $K$ convolutions with $S$-sparse kernels of known support. The dictionary update step associated with this strategy is a non-convex optimization problem, and the purpose of the present paper is to study it. We first reformulate the problem to reduce the number of its irrelevant stationary points. A Gauss-Seidel type algorithm, referred to as the Alternative Least Square Algorithm, is introduced to solve it. The search space of the considered optimization problem has dimension $KS$, which is typically smaller than the size of the target atom and much smaller than the size of the image. Moreover, the complexity of the algorithm is linear in the size of the image, allowing larger atoms to be learned (as opposed to small patches). The conducted experiments show that, when $K$ is large (say $K=10$), many atoms such as apodized modified discrete cosines, curvelets, sinc functions, or cosines can be approximated with very good accuracy.
We also argue empirically that, surprisingly, the algorithm generally converges to a global minimum for large values of $K$ and $S$.
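To make the parameterization in the abstract concrete, the following sketch (hypothetical, not the authors' code) builds a 1D atom as the composition of $K$ convolutions with $S$-sparse kernels of known support, using illustrative values $K=10$, $S=3$, and a kernel support of length 5. It shows the key point of the method: the resulting atom can be much larger than the number of free parameters, which is only $KS$.

```python
import numpy as np

rng = np.random.default_rng(0)
K, S = 10, 3          # number of composed kernels, nonzeros per kernel
kernel_len = 5        # known support size of each kernel (assumed here)

atom = np.array([1.0])                    # start from a Dirac impulse
for _ in range(K):
    h = np.zeros(kernel_len)
    idx = rng.choice(kernel_len, size=S, replace=False)
    h[idx] = rng.standard_normal(S)       # an S-sparse kernel on its support
    atom = np.convolve(atom, h)           # compose the convolutions

# The atom spans K*(kernel_len-1)+1 = 41 samples,
# but is described by only K*S = 30 free parameters.
print(atom.size)   # 41
print(K * S)       # 30
```

Applying such an atom to an image reduces to $K$ convolutions with sparse kernels, which is what makes the learned transform fast: the cost is linear in the image size, as stated in the abstract.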
Document type: Journal article
International Journal of Computer Vision, Springer Verlag, 2015, 114 (2), pp. 195-216

Cited literature [39 references]

https://hal.archives-ouvertes.fr/hal-00862903
Contributor: Francois Malgouyres <>
Submitted on: Wednesday, July 16, 2014 - 10:45:30
Last modified on: Wednesday, February 28, 2018 - 10:22:54
Long-term archiving on: Monday, November 24, 2014 - 15:50:38

### File

FTL.pdf
Publisher files allowed on an open archive

### Identifiers

• HAL Id: hal-00862903, version 3

### Citation

Olivier Chabiron, Francois Malgouyres, Jean-Yves Tourneret, Nicolas Dobigeon. Toward Fast Transform Learning. International Journal of Computer Vision, Springer Verlag, 2015, 114 (2), pp.195-216. 〈hal-00862903v3〉

### Metrics

Record views: 258
File downloads