Abstract
To simultaneously benefit from the computational merits of the Hestenes–Stiefel method and the worthwhile descent and convergence properties of the Dai–Yuan method, Andrei (Stud. Inform. Control 17, 55–70, 2008) introduced a hybrid conjugate gradient algorithm by convexly combining the parameters of the two methods. Here, motivated by the advantages of such convex combinations of conjugate gradient methods, three hybrid conjugate gradient algorithms are proposed using the ellipsoid norm (an extension of the Euclidean norm) in a least-squares framework. To determine the hybridization parameter of the given methods, quasi-Newton aspects are also employed, including the secant equation as well as the memoryless (inverse) Hessian updating formulas. The computational advantages of the given algorithms are demonstrated on a set of CUTEr test functions.
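To illustrate the hybridization idea the abstract refers to, the following is a minimal sketch of a conjugate gradient iteration whose parameter is a convex combination beta_k = (1 - theta) * beta_HS + theta * beta_DY of the Hestenes–Stiefel and Dai–Yuan parameters. Using a fixed theta in [0, 1] and an exact line search on a quadratic test problem are simplifying assumptions for illustration only; in the methods of the paper, theta is determined adaptively from least-squares and quasi-Newton conditions.

```python
import numpy as np

def hybrid_cg(A, b, x0, theta=0.5, tol=1e-10, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the hybrid direction d_{k+1} = -g_{k+1} + beta_k * d_k, where
    beta_k = (1 - theta) * beta_HS + theta * beta_DY.

    A fixed theta is an illustrative assumption, not the paper's rule.
    """
    x = x0.astype(float)
    g = A @ x - b            # gradient of the quadratic objective
    d = -g                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)    # exact line search (quadratic case)
        x_new = x + alpha * d
        g_new = A @ x_new - b
        y = g_new - g                     # gradient difference y_k
        denom = d @ y
        beta_hs = (g_new @ y) / denom     # Hestenes–Stiefel parameter
        beta_dy = (g_new @ g_new) / denom # Dai–Yuan parameter
        beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d             # hybrid search direction
        x, g = x_new, g_new
    return x

# Small SPD test problem: the minimizer satisfies A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = hybrid_cg(A, b, np.zeros(2))
```

Note that for theta = 0 the iteration reduces to the Hestenes–Stiefel method and for theta = 1 to the Dai–Yuan method, so the convex combination interpolates between the two.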