Worst case complexity of direct search

In this paper, we prove that the broad class of direct-search methods of directional type based on imposing sufficient decrease to accept new iterates shares the worst case complexity bound of steepest descent for the unconstrained minimization of a smooth function, more precisely that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is at most proportional to the inverse of the threshold squared. In direct-search methods, the objective function is evaluated, at each iteration, at a finite number of points. No derivatives are required. The action of declaring an iteration successful (moving to a point of lower objective function value) or unsuccessful (staying at the same iterate) is based on objective function value comparisons. Some of these methods are directional in the sense of moving along predefined directions along which the objective function will eventually decrease for sufficiently small step sizes. The worst case complexity bounds derived measure the maximum number of iterations as well as the maximum number of objective function evaluations required to find a point with a required norm of the gradient of the objective function, and are proved for such directional direct-search methods when a sufficient decrease condition based on the size of the steps is imposed to accept new iterates.
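
The following Python sketch illustrates the kind of method the abstract describes: a directional direct search that polls finitely many trial points along a positive spanning set and accepts a new iterate only under sufficient decrease. It is not code from the paper; the function names, the coordinate directions, the halving factor, and the quadratic forcing function rho(alpha) = c * alpha**2 are illustrative assumptions.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, alpha_min=1e-8, max_evals=10_000):
    """Directional direct search with a sufficient-decrease acceptance test.

    Polls the objective along the positive spanning set {+e_i, -e_i}.
    A trial point is accepted only if it decreases f by at least
    rho(alpha) = c * alpha**2, a forcing function of the step size.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    alpha = alpha0
    n = x.size
    # Coordinate directions and their negatives form a positive spanning set.
    directions = np.vstack([np.eye(n), -np.eye(n)])

    while alpha > alpha_min and evals < max_evals:
        rho = c * alpha ** 2          # sufficient-decrease threshold
        success = False
        for d in directions:          # finitely many evaluations per iteration
            trial = x + alpha * d
            f_trial = f(trial)
            evals += 1
            if f_trial < fx - rho:    # successful: move to the better point
                x, fx = trial, f_trial
                success = True
                break
        if not success:               # unsuccessful: stay put, shrink the step
            alpha *= 0.5
    return x, fx, evals

# Hypothetical usage on a smooth quadratic.
x_best, f_best, n_evals = direct_search(
    lambda z: (z[0] - 1.0) ** 2 + 2.0 * (z[1] + 0.5) ** 2, x0=[3.0, -2.0]
)
print(x_best, f_best, n_evals)
```

The acceptance test compares the decrease in f against a forcing function of the step size rather than requiring simple decrease; this sufficient-decrease condition based on the size of the steps is the mechanism for which the paper proves the steepest-descent-like bound, i.e., at most on the order of the inverse threshold squared iterations to drive the gradient norm below a given threshold.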

Bibliographic Details
Main Author: L.N. Vicente
Format: article
Language: EN
Published: Elsevier, 2013
Subjects: Derivative-free optimization; Direct search; Worst case complexity; Sufficient decrease
Online Access: https://doaj.org/article/7bc2aa3f370a40d2b35db0657f4372e6
id oai:doaj.org-article:7bc2aa3f370a40d2b35db0657f4372e6
record_format dspace
spelling oai:doaj.org-article:7bc2aa3f370a40d2b35db0657f4372e6
last_indexed 2021-12-02T05:00:36Z
title Worst case complexity of direct search
issn 2192-4406
doi 10.1007/s13675-012-0003-7
publishDate 2013-05-01
url https://doaj.org/article/7bc2aa3f370a40d2b35db0657f4372e6
url http://www.sciencedirect.com/science/article/pii/S2192440621000150
url https://doaj.org/toc/2192-4406
author L.N. Vicente
publisher Elsevier
format article
language EN
relation EURO Journal on Computational Optimization, Vol 1, Iss 1, Pp 143-153 (2013)
institution DOAJ
collection DOAJ
language EN
topic Derivative-free optimization
Direct search
Worst case complexity
Sufficient decrease
Applied mathematics. Quantitative methods
T57-57.97
Electronic computers. Computer science
QA75.5-76.95
format article
author L.N. Vicente
author_facet L.N. Vicente
author_sort L.N. Vicente
title Worst case complexity of direct search
title_short Worst case complexity of direct search
title_full Worst case complexity of direct search
title_fullStr Worst case complexity of direct search
title_full_unstemmed Worst case complexity of direct search
title_sort worst case complexity of direct search
publisher Elsevier
publishDate 2013
url https://doaj.org/article/7bc2aa3f370a40d2b35db0657f4372e6
work_keys_str_mv AT lnvicente worstcasecomplexityofdirectsearch
_version_ 1718400837394366464