Nonsmooth spectral gradient methods for unconstrained optimization
Saved in:

| Field | Value |
|---|---|
| Main Authors | |
| Format | article |
| Language | EN |
| Published | Elsevier, 2017 |
| Subjects | |
| Online Access | https://doaj.org/article/09769c1ca2fc49daa0950264b364bf55 |
Summary: To solve nonsmooth unconstrained minimization problems, we combine the spectral choice of step length with two well-established subdifferential-type schemes: the gradient sampling method and the simplex gradient method. We focus on the interesting case in which the objective function is continuously differentiable almost everywhere but often not differentiable at minimizers. For the gradient sampling method, we also present a simple differentiability test that lets us use the exact gradient direction as frequently as possible and build a stochastic subdifferential direction only when the test fails. The proposed spectral gradient sampling method is combined with a monotone line search globalization strategy. The simplex gradient method, by contrast, is a direct search method that requires only function evaluations to build an approximation to the gradient direction; here the proposed spectral simplex gradient method is combined with a suitable nonmonotone line search strategy. For both scenarios, we discuss convergence properties and present numerical results on a set of nonsmooth test functions. These results indicate that using a spectral step length can improve the practical performance of both methods.
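The two ingredients named in the abstract — a simplex gradient built from function values only, and a spectral (Barzilai–Borwein) choice of step length — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the forward-difference simplex gradient, the BB1 spectral step, the monotone Armijo backtracking safeguard, and all tolerances are assumptions made for the sketch; the paper's simplex variant and its nonmonotone line search may differ in detail.

```python
import numpy as np

def simplex_gradient(f, x, h=1e-6):
    """Forward-difference simplex gradient: needs only function values,
    no derivatives (a direct-search gradient surrogate)."""
    n = len(x)
    fx = f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def spectral_simplex_descent(f, x0, max_iter=500, tol=1e-8):
    """Descent on the simplex-gradient direction with a spectral
    (Barzilai-Borwein) trial step, safeguarded by backtracking."""
    x = np.asarray(x0, dtype=float)
    g = simplex_gradient(f, x)
    alpha = 1.0  # initial trial step
    for _ in range(max_iter):
        # monotone Armijo-style backtracking from the spectral trial step
        t, fx = alpha, f(x)
        while f(x - t * g) > fx - 1e-4 * t * g.dot(g) and t > 1e-12:
            t *= 0.5
        x_new = x - t * g
        g_new = simplex_gradient(f, x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        # spectral (BB1) step length s's / s'y, safeguarded to stay positive
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x
```

For example, `spectral_simplex_descent(lambda v: v.dot(v), [3.0, -4.0])` returns a point close to the minimizer at the origin; on genuinely nonsmooth objectives the finite-difference bias near kinks is exactly why the safeguarding and the line search strategies discussed in the article matter.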