Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term
Abstract
First order line search optimization techniques have gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational cost of second order methods becomes prohibitive for large optimization tasks; in such cases the only applicable techniques are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem to a single-step calculation of an appropriate step length. This remarkably simplifies the implementation and computational complexity of the line search subproblem without harming the stability of the method. The algorithm is proven convergent with superlinear convergence rates, and is exactly classified within the previously proposed classification framework for first order optimization [1]. The performance of the proposed algorithm is evaluated on five data sets and compared to a relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
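The abstract describes a conjugate-gradient-style update with a constant momentum term and a step length computed in a single step rather than by an iterative line search. The sketch below illustrates that general structure only; the step-length rule used here (a normalized gradient step) and the parameter values are hypothetical assumptions for illustration, not the formula proposed in the paper.

```python
import numpy as np

def cg_constant_momentum(grad, x0, beta=0.5, eta=0.1, iters=200):
    """Conjugate-gradient-style descent with a constant momentum term.

    beta : constant momentum coefficient applied to the previous direction.
    eta  : scale for the hypothetical single-step step-length rule below
           (NOT the rule from the paper, which the abstract does not give).
    """
    x = np.asarray(x0, dtype=float)
    d = -grad(x)  # initial search direction: steepest descent
    for _ in range(iters):
        g = grad(x)
        # Direction update with a constant momentum term beta.
        d = -g + beta * d
        # Single-step step-length calculation (assumed rule: no iterative
        # line search, just one closed-form evaluation per iteration).
        alpha = eta / (1.0 + np.linalg.norm(g))
        x = x + alpha * d
    return x

# Usage: minimize f(x) = ||x||^2, whose gradient is 2x.
x_min = cg_constant_momentum(lambda x: 2.0 * x, np.array([3.0, -2.0]))
```

The key point the abstract emphasizes is that `alpha` is obtained from one calculation per iteration, so each step costs a single gradient evaluation plus vector arithmetic, instead of the repeated function evaluations of a conventional line search.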
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2000-11-25
Authors
-
Usui Shiro
Department of Information and Computer Sciences, Toyohashi University of Technology
-
Geczy Peter
Department of Information and Computer Sciences, Toyohashi University of Technology
Related articles
- Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term
- Novel First Order Optimization Classification Framework