Two modified hybrid conjugate gradient methods based on a hybrid secant equation

    Saman Babaie-Kafaki
    Nezam Mahdavi-Amiri

Abstract

Taking advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we propose two globally convergent hybridizations of these methods, following Andrei's approach of convexly combining the conjugate gradient parameters and Powell's approach of restricting the conjugate gradient parameter to nonnegative values. In our methods, the hybridization parameter is computed from a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.
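To make the abstract's central idea concrete, the following is a minimal sketch of one step of the hybridization it describes: a convex combination of the Hestenes–Stiefel and Dai–Yuan conjugate gradient parameters, followed by Powell's nonnegative restriction. The hybridization parameter `theta` is taken as a free input here; in the paper it is determined from the hybrid secant equation, a derivation not reproduced in this sketch.

```python
def dot(u, v):
    """Euclidean inner product of two vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

def hybrid_beta(g_new, g_old, d, theta):
    """Hybrid conjugate gradient parameter
    beta = max(0, (1 - theta) * beta_HS + theta * beta_DY),
    where theta in [0, 1] is the hybridization parameter
    (here supplied by the caller, not derived from the secant equation)."""
    y = [a - b for a, b in zip(g_new, g_old)]   # gradient difference y_k
    denom = dot(d, y)                           # shared denominator d_k^T y_k
    beta_hs = dot(g_new, y) / denom             # Hestenes-Stiefel parameter
    beta_dy = dot(g_new, g_new) / denom         # Dai-Yuan parameter
    # Convex combination (Andrei), then nonnegative restriction (Powell)
    return max(0.0, (1.0 - theta) * beta_hs + theta * beta_dy)

def next_direction(g_new, g_old, d, theta):
    """New search direction d_{k+1} = -g_{k+1} + beta * d_k."""
    beta = hybrid_beta(g_new, g_old, d, theta)
    return [-g + beta * di for g, di in zip(g_new, d)]
```

The `max(0, ...)` truncation implements Powell's restriction: whenever the convex combination would go negative, the method falls back to the steepest descent direction for that step.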

Keywords:

unconstrained optimization, large-scale optimization, conjugate gradient method, secant equation, global convergence

How to Cite

Babaie-Kafaki, S., & Mahdavi-Amiri, N. (2013). Two modified hybrid conjugate gradient methods based on a hybrid secant equation. Mathematical Modelling and Analysis, 18(1), 32-52. https://doi.org/10.3846/13926292.2013.756832

Published in Issue
February 1, 2013
Section

Articles
