Communications in Mathematical Sciences
Volume 18 (2020)
A three-term conjugate gradient algorithm using subspace for large-scale unconstrained optimization
Pages: 1179 – 1190
It is well known that conjugate gradient methods are well suited to large-scale nonlinear optimization problems, owing to their simple calculations and low storage requirements. In this paper, we present a three-term conjugate gradient method using a subspace technique for large-scale unconstrained optimization, in which the search direction is determined by minimizing a quadratic approximation of the objective function over a subspace; two cases of the subspace are considered. We show that the search direction satisfies both the descent condition and the Dai–Liao conjugacy condition. Under proper assumptions, a global convergence result for the proposed method is established. Numerical experiments show that the proposed method is efficient and robust.
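To illustrate the general idea behind a three-term conjugate gradient iteration with a guaranteed descent direction, the following Python sketch implements a generic Hestenes–Stiefel-based three-term update with an Armijo backtracking line search. This is an assumption-laden illustration, not the paper's subspace-based method: the coefficient choices `beta` and `theta` and the function names are illustrative, chosen so that the direction satisfies the descent condition mentioned in the abstract.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Generic three-term conjugate gradient sketch (NOT the paper's method).

    Direction update:
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    where y_k = g_{k+1} - g_k,
        beta_k  = g_{k+1}^T y_k / (d_k^T y_k)   (Hestenes-Stiefel),
        theta_k = g_{k+1}^T d_k / (d_k^T y_k).
    With these choices g_{k+1}^T d_{k+1} = -||g_{k+1}||^2, so d_{k+1}
    is always a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d.
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom
            theta = (g_new @ d) / denom
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new  # near-zero curvature: restart with steepest descent
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, this iteration drives the gradient norm below the tolerance; the descent property follows directly from the choice of `beta` and `theta` above.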
subspace, three-term conjugate gradient method, global convergence, large-scale, unconstrained optimization
2010 Mathematics Subject Classification
65K05, 90C06, 90C30
This work was supported by the National Natural Science Foundation of China (11171003), by the Innovation Talent Training Program of Science and Technology of Jilin Province of China (20180519011JH), and by the Science and Technology Development Project Program of Jilin Province (20190303132SF).
Yueting Yang is the corresponding author.
Received 18 June 2019
Accepted 26 January 2020
Published 23 September 2020