Open Access
RAIRO-Oper. Res.
Volume 58, Number 1, January-February 2024
Page(s) 665 - 679
DOI https://doi.org/10.1051/ro/2023093
Published online 22 February 2024
  • F. Abdollahi and M. Fatemi, A new conjugate gradient method based on a modified secant condition with its applications in image processing. RAIRO-Oper. Res. 55 (2021) 167–187.
  • Z. Aminifard and S. Babaie-Kafaki, A modified descent Polak–Ribière–Polyak conjugate gradient method with global convergence property for nonconvex functions. Calcolo 56 (2019) 1–11.
  • Z. Aminifard and S. Babaie-Kafaki, Dai–Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer. Algorithms 89 (2022) 1369–1387.
  • N. Andrei, Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204 (2010) 410–420.
  • N. Andrei, Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inf. Control 16 (2007) 333–352.
  • N. Andrei, A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77 (2018) 1273–1282.
  • S. Babaie-Kafaki and R. Ghanbari, The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234 (2014) 625–630.
  • S. Babaie-Kafaki and R. Ghanbari, A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30 (2015) 673–681.
  • J. Barzilai and J.M. Borwein, Two-point step size gradient methods. IMA J. Numer. Anal. 8 (1988) 141–148.
  • E. Birgin and J.M. Martinez, A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43 (2001) 117–128.
  • J. Cao and J. Wu, A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152 (2020) 243–252.
  • Y.H. Dai and L.Z. Liao, New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43 (2001) 87–101.
  • Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10 (1999) 177–182.
  • S. Deng, Z. Wan and X. Chen, An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems. J. Optim. Theory Appl. 157 (2013) 820–842.
  • E. Dolan and J.J. Moré, Benchmarking optimization software with performance profiles. Math. Program. Ser. A 91 (2002) 201–213.
  • M. Fatemi, A scaled conjugate gradient method for nonlinear unconstrained optimization. Optim. Methods Softw. 32 (2017) 1095–1112.
  • R. Fletcher and C.M. Reeves, Function minimization by conjugate gradients. Comput. J. 7 (1964) 149–154.
  • J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2 (1992) 21–42.
  • N.I.M. Gould, D. Orban and P.L. Toint, CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60 (2015) 545–557.
  • W.W. Hager and H.A. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16 (2005) 170–192.
  • W.W. Hager and H.A. Zhang, Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32 (2006) 113–137.
  • W.W. Hager and H.A. Zhang, A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2 (2006) 35–58.
  • M.R. Hestenes and E.L. Stiefel, Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49 (1952) 409–436.
  • J. Jian, Q. Chen, X. Jiang, Y. Zeng and J. Yin, A new spectral conjugate gradient method for large-scale unconstrained optimization. Optim. Methods Softw. 32 (2016) 503–515.
  • J. Jian, P. Liu, X. Jiang and C. Zhang, Two classes of spectral conjugate gradient methods for unconstrained optimizations. J. Appl. Math. Comput. 68 (2022) 4435–4456.
  • C.X. Kou and Y.H. Dai, A modified self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for unconstrained optimization. J. Optim. Theory Appl. 165 (2015) 209–224.
  • Y. Liu and C. Storey, Efficient generalized conjugate gradient algorithms, part 1: Theory. J. Optim. Theory Appl. 69 (1991) 129–137.
  • I.E. Livieris, V. Tampakas and P. Pintelas, A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numer. Algorithms 79 (2018) 1169–1185.
  • M. Lotfi and S.M. Hosseini, An efficient Dai–Liao type conjugate gradient method by reformulating the CG parameter in the search direction equation. J. Comput. Appl. Math. 371 (2020) 112708.
  • E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inform. Rech. Opér. 3 (1969) 35–43.
  • B.T. Polyak, The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9 (1969) 94–112.
  • M. Raydan, The Barzilai and Borwein gradient method for the large-scale unconstrained minimization problem. SIAM J. Optim. 7 (1997) 26–33.
  • L. Wang, W. Sun, R.J.B. Sampaio and J. Yuan, A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems. Appl. Math. Comput. 262 (2015) 136–144.
  • G. Yu, J. Huang and Y. Zhou, A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett. 23 (2010) 555–560.
  • Y. Zhang and B. Dan, An efficient adaptive scaling parameter for the spectral conjugate gradient method. Optim. Lett. 10 (2016) 119.
  • L. Zhang, W. Zhou and D.H. Li, A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26 (2006) 629–640.
  • L. Zhang, W. Zhou and D.H. Li, Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22 (2007) 697–711.
  • G. Zoutendijk, Nonlinear programming, computational methods, in Integer and Nonlinear Programming, edited by J. Abadie. North-Holland, Amsterdam (1970) 37–86.
