EURO J Comput Optim
DOI 10.1007/s13675-017-0080-8
ORIGINAL PAPER
Nonsmooth spectral gradient methods for unconstrained optimization
Milagros Loreto¹ · Hugo Aponte² · Debora Cores³ · Marcos Raydan³
Received: 21 November 2015 / Accepted: 12 January 2017
© EURO - The Association of European Operational Research Societies 2017
Abstract To solve nonsmooth unconstrained minimization problems, we combine the
spectral choice of step length with two well-established subdifferential-type schemes:
the gradient sampling method and the simplex gradient method. We focus on the
interesting case in which the objective function is continuously differentiable almost
everywhere, and it is often not differentiable at minimizers. In the case of the gradient
sampling method, we also present a simple differentiability test that allows us to use
the exact gradient direction as frequently as possible, and to build a stochastic sub-
differential direction only if the test fails. The proposed spectral gradient sampling
method is combined with a monotone line search globalization strategy. On the other
hand, the simplex gradient method is a direct search method that only requires func-
Hugo Aponte: The work here presented is not affiliated or endorsed by Microsoft.
Debora Cores: Partially supported by CESMA at USB.
Marcos Raydan: Partially supported by CCCT Center at UCV.
✉ Milagros Loreto
mloreto@uw.edu
Hugo Aponte
huaponte@microsoft.com
Debora Cores
cores@usb.ve
Marcos Raydan
mraydan@usb.ve
1 School of Science, Technology, Engineering, and Mathematics (STEM), University of Washington Bothell, 18115 Campus Way NE, Bothell, WA 98011-8246, USA
2 Microsoft, Redmond, WA, USA
3 Departamento de Cómputo Científico y Estadística, Universidad Simón Bolívar, Ap. 89000, Caracas 1080, Venezuela