Two modified hybrid conjugate gradient methods based on a hybrid secant equation
Abstract
Taking advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods, following Andrei's approach of convexly combining the conjugate gradient parameters and Powell's approach of restricting the conjugate gradient parameters to be nonnegative. In our methods, the hybridization parameter is obtained from a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.
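To make the hybridization concrete, the sketch below applies a convex combination of the Hestenes–Stiefel and Dai–Yuan parameters, with Powell's nonnegative restriction, to a quadratic test problem. This is an illustrative sketch only: the fixed weight `theta` is a stand-in, since in the paper the hybridization parameter is derived from a hybrid secant equation, and the exact line search used here is specific to the quadratic case.

```python
import numpy as np

def hybrid_cg(A, b, x0, theta=0.5, tol=1e-8, max_iter=100):
    """Illustrative hybrid HS/DY conjugate gradient method applied to the
    quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is g = A x - b.
    `theta` is a fixed illustrative weight; the paper instead computes the
    hybridization parameter from a hybrid secant equation."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search for the quadratic model
        alpha = -(g @ d) / (d @ (A @ d))
        x_new = x + alpha * d
        g_new = A @ x_new - b
        y = g_new - g                      # gradient difference y_k
        denom = d @ y
        beta_hs = (g_new @ y) / denom      # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / denom  # Dai-Yuan parameter
        # Convex combination (Andrei) with nonnegative restriction (Powell)
        beta = max(theta * beta_hs + (1.0 - theta) * beta_dy, 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic with exact line search the HS and DY parameters coincide, so any convex combination reproduces the classical linear conjugate gradient iteration; the hybridization matters on general nonlinear objectives with inexact line searches, which is the setting the paper addresses.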
Keywords:
unconstrained optimization, large-scale optimization, conjugate gradient method, secant equation, global convergence
License
Copyright (c) 2013 The Author(s). Published by Vilnius Gediminas Technical University.
This work is licensed under a Creative Commons Attribution 4.0 International License.