Numer Algor
https://doi.org/10.1007/s11075-017-0460-4
ORIGINAL PAPER
Hybridization of accelerated gradient descent method
Milena Petrović¹ · Vladimir Rakočević²˒³ · Nataša Kontrec¹ · Stefan Panić¹ · Dejan Ilić³
Received: 7 December 2016 / Accepted: 11 December 2017
© Springer Science+Business Media, LLC, part of Springer Nature 2017
Abstract We present a gradient descent algorithm with a line search procedure for
solving unconstrained optimization problems, defined by applying the Picard-Mann
hybrid iterative process to the accelerated gradient descent SM method described
in Stanimirović and Miladinović (Numer. Algor. 54, 503–520, 2010). Using the
merged features of both analyzed models, we show that the new accelerated gradient
descent model converges linearly and faster than the original SM method, which is
confirmed by the displayed numerical test results. Three main properties are tested:
number of iterations, CPU time, and number of function evaluations. The efficiency
of the proposed iteration is examined for several values of the correction parameter
introduced in Khan (2013).
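To illustrate the hybridization idea from the abstract, the following sketch applies the Picard-Mann hybrid iteration of Khan (2013), y_k = (1 − α)x_k + αT(x_k), x_{k+1} = T(y_k), to a plain gradient-descent operator T(x) = x − t∇f(x). This is a minimal illustration under simplifying assumptions (a fixed step size t and constant correction parameter α), not the paper's accelerated SM step-size rule or its exact HSM scheme; the function and parameter names are chosen for exposition.

```python
import numpy as np

def picard_mann_gd(f, grad, x0, step=0.1, alpha=0.5, tol=1e-6, max_iter=10_000):
    """Picard-Mann hybrid iteration applied to the gradient-descent map
    T(x) = x - step * grad(x).

    Illustrative sketch only: uses a fixed step size instead of the
    accelerated (SM-type) step sizes analyzed in the paper.
    """
    T = lambda x: x - step * grad(x)          # underlying Picard operator
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        y = (1 - alpha) * x + alpha * T(x)    # Mann averaging step (parameter alpha)
        x = T(y)                              # Picard step applied to the average
        if np.linalg.norm(grad(x)) < tol:     # gradient-norm stopping criterion
            return x, k + 1
    return x, max_iter
```

For a strictly convex quadratic such as f(x) = ‖x‖², each hybrid step contracts the iterate by the factor (1 − 2t)(1 − 2tα), so the iteration converges linearly and, for α > 0, faster per step than the plain Picard (pure gradient-descent) iteration with the same step size.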
Milena Petrović
milena.petrovic@pr.ac.rs
Vladimir Rakočević
vrakoc@sbb.rs
Nataša Kontrec
natasa.kontrec@pr.ac.rs
Stefan Panić
stefanpnc@yahoo.com
Dejan Ilić
ilicde@ptt.rs
1 Faculty of Sciences and Mathematics, University of Priština, Lole Ribara 29, 29000 Kosovska Mitrovica, Serbia
2 Serbian Academy of Sciences and Arts, Kneza Mihaila 35, 11000 Belgrade, Serbia
3 Faculty of Sciences and Mathematics, University of Niš, Višegradska 33, 18000 Niš, Serbia