Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
Abstract
We investigate the convergence of subgradient-oriented descent methods in non-smooth non-convex optimization. We prove subsequence convergence for functions with a strict standard model, and we show that convergence to a single critical point can be guaranteed if the strong Kurdyka-Łojasiewicz condition is added. We show by way of an example that the Kurdyka-Łojasiewicz inequality alone is not sufficient to guarantee convergence to critical points.
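For reference, the Kurdyka-Łojasiewicz inequality invoked above is commonly stated as follows; this is the standard formulation from the literature, and the notation here is illustrative rather than taken from the paper itself. A lower semicontinuous function $f$ satisfies the KL inequality at a point $\bar{x}$ if there exist $\eta > 0$, a neighborhood $U$ of $\bar{x}$, and a concave desingularizing function $\varphi \colon [0,\eta) \to [0,\infty)$, continuous with $\varphi(0) = 0$ and continuously differentiable with $\varphi' > 0$ on $(0,\eta)$, such that

\[
\varphi'\bigl(f(x) - f(\bar{x})\bigr)\,\operatorname{dist}\bigl(0, \partial f(x)\bigr) \ge 1
\]

for all $x \in U$ with $f(\bar{x}) < f(x) < f(\bar{x}) + \eta$, where $\partial f$ denotes the limiting subdifferential. Intuitively, the inequality forces $f$ to be sharp, after the reparametrization by $\varphi$, near its critical points, which is what rules out descent sequences that approach the critical set without converging to a single point.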
Origin: Files produced by the author(s)