In collaboration with Payame Noor University and the Iranian Society of Instrumentation and Control Engineers

Document Type: Research Article

Authors

1 Department of Applied Mathematics, Payame Noor University, Tehran, 193953697, Iran

2 Department of Applied Mathematics, Payame Noor University, Tehran, 193953697, Iran

Abstract

Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (J. Comput. Appl. Math. 202:523-539, 2007), using two approaches that employ an extended conjugacy condition. The first is based on a modified descent three-term search direction, in the spirit of the descent Hestenes-Stiefel CG method. The second is based on a quasi-Newton (QN) approach. Global convergence of the proposed methods is proved for uniformly convex functions and for general functions. Numerical experiments are carried out on a set of test functions from the CUTEr collection, and the results are compared with those of some well-known methods.
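To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a generic Dai-Liao CG iteration, where the direction is d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k). This is the classical DL scheme (Dai and Liao, 2001), not the new parameter choices proposed in the article; the value t = 0.1, the quadratic test problem, and the exact line search are illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(A, b, x0, t=0.1, tol=1e-8, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x with the Dai-Liao CG direction.

    beta_k = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k)   (classical DL parameter)

    Exact line search is used, which is available for a quadratic objective;
    the article's methods instead use Wolfe-type line searches on general f.
    """
    x = x0.astype(float)
    g = A @ x - b            # gradient of the quadratic model
    d = -g                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact step length for a quadratic
        x_new = x + alpha * d
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g        # step and gradient-difference vectors
        beta = g_new @ (y - t * s) / (d @ y)   # Dai-Liao conjugacy parameter
        d = -g_new + beta * d              # new search direction
        x, g = x_new, g_new
    return x

# small usage example on a 3x3 symmetric positive definite system
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x_star = dai_liao_cg(A, b, np.zeros(3))
```

With exact line search on a quadratic, g_{k+1}^T s_k = 0, so the DL formula reduces to the Hestenes-Stiefel parameter; the t-term only matters under inexact line searches, which is where the choice of t (the subject of the article) becomes relevant.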

References

[1] Andrei N. (2008). "An unconstrained optimization test functions collection", Advanced Modeling and Optimization, 10, 147-161.
[2] Andrei N. (2011). "Open problems in conjugate gradient algorithms for unconstrained optimization", Bulletin of the Malaysian Mathematical Sciences Society, 34, 319-330.
[3] Perry A. (1978). "A modified conjugate gradient algorithm", Operations Research, 26, 1073-1078.
[4] Polyak B. T. (1969). "The conjugate gradient method in extremal problems", USSR Computational Mathematics and Mathematical Physics, 9, 94-112.
[5] Shanno D. F. (1978). "Conjugate gradient methods with inexact searches", Mathematics of Operations Research, 3, 244-256.
[6] Li D. H., Fukushima M. (2001). "A modified BFGS method and its global convergence in nonconvex minimization", Journal of Computational and Applied Mathematics, 129, 15-35.
[7] Li D. H., Fukushima M. (2001). "On the global convergence of the BFGS method for nonconvex unconstrained optimization problems", SIAM Journal on Optimization, 11, 1054-1064.
[8] Dolan E. D., Moré J. J. (2002). "Benchmarking optimization software with performance profiles", Mathematical Programming, 91, 201-213.
[9] Polak E., Ribière G. (1969). "Note sur la convergence de méthodes de directions conjuguées", ESAIM: Modélisation Mathématique et Analyse Numérique, 3, 35-43.
[10] Li G., Tang C., Wei Z. (2007). "New conjugacy condition and related new conjugate gradient methods for unconstrained optimization", Journal of Computational and Applied Mathematics, 202, 523-539.
[11] Yabe H., Takano M. (2004). "Global convergence properties of nonlinear conjugate gradient methods with modified secant condition", Computational Optimization and Applications, 28, 203-225.
[12] Livieris I. E., Pintelas P. (2013). "A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization", Journal of Computational and Applied Mathematics, 239, 396-405.
[13] Zhang J. Z., Deng N. Y., Chen L. H. (1999). "New quasi-Newton equation and related methods for unconstrained optimization", Journal of Optimization Theory and Applications, 102, 147-167.
[14] Zhang J., Xu C. (2001). "Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations", Journal of Computational and Applied Mathematics, 137, 269-278.
[15] Sugiki K., Narushima Y., Yabe H. (2012). "Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization", Journal of Optimization Theory and Applications, 153, 733-757.
[16] Zhang K., Liu H., Liu Z. (2019). "A new Dai-Liao conjugate gradient method with optimal parameter choice", Numerical Functional Analysis and Optimization, 40, 194-215.
[17] Zhou W., Zhang L. (2006). "A nonlinear conjugate gradient method based on the MBFGS secant condition", Optimization Methods and Software, 21, 707-714.
[18] Zhang L., Zhou W., Li D. (2007). "Some descent three-term conjugate gradient methods and their global convergence", Optimization Methods and Software, 22, 697-711.
[19] Powell M. J. D. (1984). "Nonconvex minimization calculations and the conjugate gradient method", in Numerical Analysis, Springer, Berlin, Heidelberg, 122-141.
[20] Peyghami M. R., Ahmadzadeh H., Fazli A. (2015). "A new class of efficient and globally convergent conjugate gradient methods in the Dai-Liao family", Optimization Methods and Software, 30, 843-863.
[21] Hestenes M. R., Stiefel E. (1952). "Methods of conjugate gradients for solving linear systems", Journal of Research of the National Bureau of Standards, 49, 409-436.
[22] Wolfe P. (1969). "Convergence conditions for ascent methods", SIAM Review, 11, 226-235.
[23] Fletcher R., Reeves C. M. (1964). "Function minimization by conjugate gradients", The Computer Journal, 7, 149-154.
[24] Babaie-Kafaki S. (2016). "On optimality of two adaptive choices for the parameter of Dai-Liao method", Optimization Letters, 10, 1789-1797.
[25] Babaie-Kafaki S., Ghanbari R. (2014). "A descent family of Dai-Liao conjugate gradient methods", Optimization Methods and Software, 29, 583-591.
[26] Babaie-Kafaki S., Ghanbari R., Mahdavi-Amiri N. (2010). "Two new conjugate gradient methods based on modified secant equations", Journal of Computational and Applied Mathematics, 234, 1374-1386.
[27] Babaie-Kafaki S., Ghanbari R. (2014). "The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices", European Journal of Operational Research, 234, 625-630.
[28] Babaie-Kafaki S., Ghanbari R. (2014). "Two modified three-term conjugate gradient methods with sufficient descent property", Optimization Letters, 8, 2285-2297.
[29] Babaie-Kafaki S., Ghanbari R. (2015). "Two optimal Dai-Liao conjugate gradient methods", Optimization, 64, 2277-2287.
[30] Sun W., Yuan Y. X. (2006). "Optimization Theory and Methods: Nonlinear Programming", Springer Science & Business Media.
[31] Hager W. W., Zhang H. (2005). "A new conjugate gradient method with guaranteed descent and an efficient line search", SIAM Journal on Optimization, 16, 170-192.
[32] Hager W. W., Zhang H. (2006). "A survey of nonlinear conjugate gradient methods", Pacific Journal of Optimization, 2, 35-58.
[33] Dai Y. H., Kou C. X. (2013). "A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search", SIAM Journal on Optimization, 23, 296-320.
[34] Dai Y., Han J., Liu G., Sun D., Yin H., Yuan Y. X. (2000). "Convergence properties of nonlinear conjugate gradient methods", SIAM Journal on Optimization, 10, 345-358.
[35] Dai Y. H., Liao L. Z. (2001). "New conjugacy conditions and related nonlinear conjugate gradient methods", Applied Mathematics and Optimization, 43, 87-101.
[36] Dai Y. H., Yuan Y. (1999). "A nonlinear conjugate gradient method with a strong global convergence property", SIAM Journal on Optimization, 10, 177-182.
[37] Narushima Y., Yabe H., Ford J. A. (2011). "A three-term conjugate gradient method with sufficient descent property for unconstrained optimization", SIAM Journal on Optimization, 21, 212-230.
[38] Aminifard Z., Babaie-Kafaki S. (2019). "An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix", 4OR, 17, 317-330.
[39] Wei Z., Li G., Qi L. (2006). "New quasi-Newton methods for unconstrained optimization problems", Applied Mathematics and Computation, 175, 1156-1188.
[40] Wei Z., Yu G., Yuan G., Lian Z. (2004). "The superlinear convergence of a modified BFGS-type method for unconstrained optimization", Computational Optimization and Applications, 29, 315-332.