Novel First Order Optimization Classification Framework
Abstract
Numerous scientific and engineering fields make extensive use of optimization techniques for finding appropriate parameter values of models, and a variety of optimization methods are available for practical use. Optimization algorithms are classified primarily according to their rates of convergence. Unfortunately, in practice an optimization method with a specified convergence rate often performs substantially differently on different optimization tasks, and the theoretical classification by convergence rates then loses its relevance for practical optimization. It is therefore desirable to formulate a novel classification framework that is relevant both to the theoretical concept of convergence rates and to practical optimization. This article introduces such a classification framework. The proposed framework enables the specification of optimization techniques and optimization tasks, and it also underlines their inherent relationship to the rates of convergence. The novel classification framework is applied to categorizing the tasks of optimizing polynomials and the problem of training multilayer perceptron neural networks.
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2000-11-25
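
The abstract's central observation, that a first-order method with a given theoretical convergence rate can behave quite differently across tasks, can be illustrated with a minimal sketch: running plain gradient descent on two polynomial tasks and inspecting the empirical error ratio e_{k+1}/e_k. The test functions, step size, and iteration count below are assumptions chosen for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's framework): the same first-order
# method, plain gradient descent, is run on two polynomial tasks and the
# empirical error ratio e_{k+1}/e_k is inspected. The test functions,
# step size, and iteration count are assumptions made for illustration.
import numpy as np

def gradient_descent(grad, x0, lr, iters):
    """Plain gradient descent; returns the full iterate trajectory."""
    xs = [x0]
    x = x0
    for _ in range(iters):
        x = x - lr * grad(x)
        xs.append(x)
    return np.array(xs)

# Two polynomial tasks, both with minimizer x* = 1
grad_quadratic = lambda x: (x - 1.0)            # f(x) = 0.5 * (x - 1)^2
grad_quartic   = lambda x: 4.0 * (x - 1.0) ** 3  # f(x) = (x - 1)^4

for name, grad in [("quadratic", grad_quadratic), ("quartic", grad_quartic)]:
    xs = gradient_descent(grad, x0=3.0, lr=0.1, iters=100)
    errs = np.abs(xs - 1.0)
    ratios = errs[1:] / errs[:-1]                # empirical convergence ratio
    print(f"{name:9s} final error {errs[-1]:.2e}  last error ratio {ratios[-1]:.3f}")
```

On the quadratic the ratio settles at a constant below one (linear convergence), while on the quartic it drifts toward one (sublinear behaviour), matching the point that a convergence-rate label alone does not determine how a method performs on a particular task.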
Authors
- Usui Shiro (Department of Information and Computer Sciences, Toyohashi University of Technology)
- Geczy Peter (Department of Information and Computer Sciences, Toyohashi University of Technology)
Related papers
- Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term
- Novel First Order Optimization Classification Framework