Rapid Learning Method for Multilayered Neural Networks Using Two-Dimensional Conjugate Gradient Search
Abstract
Back-propagation learning in multilayered neural networks is based on the principle of steepest descent. This method calculates the gradient of the error function in the reverse mode of automatic differentiation. The present paper first summarizes automatic differentiation and two-dimensional conjugate gradient search based on automatic differentiation. A new learning method using two-dimensional conjugate gradient search is then proposed for neural networks. The proposed method automatically controls the learning rate and the momentum factor of back-propagation. It requires computation of the quadratic forms of the Hessian of the error function. The computation time and the memory storage are proportional to the square of the size of the neural network if all the components of the Hessian are computed explicitly. However, in the forward mode of automatic differentiation, the quadratic forms are computed at a cost proportional only to the size of the neural network. Numerical experiments show that the number of iterations is much smaller than for back-propagation, while the time taken for one iteration is about three times that of back-propagation.
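The two-dimensional search can be sketched as follows: with an update of the form dw = -eta*g + mu*d (g the gradient, d the previous update), minimizing the second-order model E(w) + g^T dw + (1/2) dw^T H dw over (eta, mu) reduces to a 2x2 linear system whose coefficients are the Hessian quadratic forms g^T H g, g^T H d, and d^T H d. Below is a minimal JAX sketch of this idea, not the paper's implementation: the toy error function and the helper names (error, hvp, two_dim_step) are hypothetical, and the quadratic forms are obtained from Hessian-vector products computed by forward-mode differentiation of the reverse-mode gradient, which keeps the cost linear in the number of parameters, as the abstract describes.

```python
import jax
import jax.numpy as jnp

# Toy convex error function standing in for a network's error
# (hypothetical; the paper minimizes a multilayer network's error).
def error(w):
    return jnp.sum(jnp.log(jnp.cosh(w))) + 0.1 * jnp.sum(w ** 2)

def hvp(f, w, v):
    # Hessian-vector product H(w) @ v: forward-mode AD applied to the
    # reverse-mode gradient, so the cost grows with the size of w,
    # not with its square.
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

def two_dim_step(w, d_prev):
    g = jax.grad(error)(w)              # gradient (reverse mode)
    Hg = hvp(error, w, g)
    if not jnp.any(d_prev):             # first step: plain steepest descent
        eta = (g @ g) / (g @ Hg)
        d = -eta * g
        return w + d, d
    Hd = hvp(error, w, d_prev)
    # Setting the (eta, mu)-derivatives of the quadratic model of
    # E(w - eta*g + mu*d_prev) to zero yields a 2x2 linear system whose
    # coefficients are the Hessian quadratic forms.
    A = jnp.array([[g @ Hg, -(g @ Hd)],
                   [-(g @ Hd), d_prev @ Hd]])
    b = jnp.array([g @ g, -(g @ d_prev)])
    eta, mu = jnp.linalg.solve(A, b)
    d = -eta * g + mu * d_prev          # learning rate and momentum in one step
    return w + d, d

w = jnp.array([1.5, -2.0, 0.5])
d = jnp.zeros_like(w)
for _ in range(10):
    w, d = two_dim_step(w, d)
    if jnp.linalg.norm(jax.grad(error)(w)) < 1e-6:
        break
print("final error:", float(error(w)))
```

Each step costs one gradient and two Hessian-vector products, consistent with the abstract's observation that an iteration is a small constant factor more expensive than a plain back-propagation step.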
- A paper of the Information Processing Society of Japan
- 1992-03-31
Authors
-
YOSHIDA TOSHINOBU
Department of Computer Science, Gunma University
Related Papers
- Rapid Learning Method for Multilayered Neural Networks Using Two-Dimensional Conjugate Gradient Search
- Partition coefficients of chlorobenzenes in heptane-water and heptane-acetonitrile systems.