A new spectral conjugate gradient method with descent condition and global convergence property for unconstrained optimization
Abstract
The spectral conjugate gradient method is an efficient method for solving large-scale unconstrained optimization problems. In this paper, we propose a new spectral conjugate gradient method and analyze its performance numerically. We establish the descent condition and the global convergence property under standard assumptions and the strong Wolfe line search. Numerical experiments evaluating the method's efficiency are conducted on 98 test problems with various dimensions and initial points. The results, measured by the number of iterations and central processing unit (CPU) time, show that the new method achieves high computational performance.
Copyright ©2024 JMCS
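The abstract describes the general spectral conjugate gradient framework: a search direction combining a spectral multiple of the negative gradient with the previous direction, a step size from a strong Wolfe line search, and a descent safeguard. The sketch below is a generic illustration of that framework only, not the paper's specific method; the Barzilai-Borwein spectral parameter and the Fletcher-Reeves conjugate parameter are assumed choices for demonstration.

```python
import numpy as np
from scipy.optimize import line_search

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic spectral CG sketch (illustrative, not the paper's method):
    d_k = -theta_k * g_k + beta_k * d_{k-1}, with a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the (strong) Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:       # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Barzilai-Borwein spectral parameter (assumed choice)
        theta = float(s @ s) / float(s @ y) if s @ y > 1e-12 else 1.0
        # Fletcher-Reeves conjugate parameter (assumed choice)
        beta = float(g_new @ g_new) / float(g @ g)
        d = -theta * g_new + beta * d
        if g_new @ d >= 0:      # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic, the iterates converge to the unique minimizer; the restart safeguard above is a simple stand-in for the descent condition the paper establishes analytically.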