Open Access Review

RAIRO-Oper. Res., Volume 57, Number 1, January-February 2023, pp. 43-58
DOI: https://doi.org/10.1051/ro/2022213
Published online: 12 January 2023