A NEW NONLINEAR CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION
Abstract
Conjugate gradient methods are widely used for large-scale unconstrained optimization problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in analyses and implementations. Dai and Yuan (1999) proposed a conjugate gradient method that generates a descent search direction at every iteration and converges globally to the solution if the Wolfe conditions are satisfied within the line search strategy. In this paper, we give a new conjugate gradient method based on the work of Dai and Yuan, and show that our method always produces a descent search direction and converges globally if the Wolfe conditions are satisfied. Moreover, our method captures second-order curvature information with higher accuracy by using the modified secant condition proposed by Zhang, Deng and Chen (1999) and Zhang and Xu (2001). Our numerical results show that our method is very efficient on standard test problems, provided a parameter included in the method is chosen well.
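To make the setting concrete, the following is a minimal sketch of the baseline Dai and Yuan (1999) scheme that the paper builds on: a nonlinear conjugate gradient iteration using the Dai-Yuan parameter beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) together with a Wolfe line search. This is not the authors' new method (whose update additionally incorporates the modified secant condition and a tunable parameter); the function names and the SciPy-based line search are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import line_search

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of the Dai-Yuan (1999) nonlinear conjugate gradient
    method, the baseline this paper extends (NOT the new method
    proposed in the paper)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the (strong) Wolfe conditions,
        # which is what the global convergence analysis assumes.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:  # line search failed to satisfy Wolfe
            break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Dai-Yuan formula: beta_k = ||g_{k+1}||^2 / (d_k^T y_k).
        # Under the Wolfe conditions, d_k^T y_k > 0, so beta_k is
        # well defined and every d_k is a descent direction.
        beta = (g_new @ g_new) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, `dai_yuan_cg(rosen, rosen_der, np.array([-1.2, 1.0]))`, with `rosen` and `rosen_der` imported from `scipy.optimize`, converges to the Rosenbrock minimizer at (1, 1).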
- A paper of the Operations Research Society of Japan
Authors
-
Yabe Hiroshi
Tokyo University of Science
-
Sakaiwa Naoki
Hitachi Information Systems, Ltd.
Related Papers
- SUPERLINEAR CONVERGENCE OF THE SHENG-ZOU-BROYDEN METHOD FOR NONLINEAR LEAST SQUARES PROBLEMS