Analysis of Momentum Term in Back-Propagation
Abstract
The back-propagation algorithm has been applied in many fields and has demonstrated the large capability of neural networks. Many practitioners use back-propagation together with a momentum term to accelerate convergence. However, despite its importance for theoretical studies, the theoretical background of the momentum term has so far been unknown. First, this paper clearly explains the theoretical origin of the momentum term in the back-propagation algorithm for both batch-mode learning and pattern-by-pattern learning. We prove that back-propagation with a momentum term can be derived from the following two assumptions: 1) the cost function is $E^n = \sum_{\mu=1}^{n} \alpha^{n-\mu} E_\mu$, where $E_\mu$ is the sum of squared errors at the output layer at the $\mu$th learning step and $\alpha$ is the momentum coefficient; 2) the latest weights are assumed in calculating the cost function $E^n$. Next, we derive a simple relationship among momentum, learning rate, and learning speed, and then discuss it further with computer simulations.
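Under these two assumptions the equivalence follows from unrolling the update: writing the per-step gradients as $g_\mu = \partial E_\mu / \partial w$, the momentum recursion $\Delta w^n = \alpha\,\Delta w^{n-1} - \eta\,g_n$ (with $\Delta w^0 = 0$) expands to $\Delta w^n = -\eta \sum_{\mu=1}^{n} \alpha^{n-\mu} g_\mu$, i.e. a single gradient step on the discounted cost $E^n$. The minimal sketch below is not from the paper: it assumes a hypothetical one-parameter least-squares problem (the names `grad`, `eta`, `alpha`, and the toy data are illustrative) and checks the identity numerically by running both forms side by side.

```python
import numpy as np

def grad(w, x, t):
    # Gradient of the per-pattern squared error E_mu = 0.5*(w*x - t)**2 w.r.t. w.
    return (w * x - t) * x

rng = np.random.default_rng(0)
xs, ts = rng.normal(size=50), rng.normal(size=50)   # toy 1-D training patterns

eta, alpha = 0.05, 0.9        # learning rate and momentum coefficient
w_mom = w_disc = 0.0          # same initial weight for both variants
dw = 0.0                      # momentum buffer (Delta w)
grads = []                    # per-step gradients g_mu for the discounted sum

for x, t in zip(xs, ts):
    # Variant 1: classical momentum update.
    dw = alpha * dw - eta * grad(w_mom, x, t)
    w_mom += dw
    # Variant 2: one gradient step on the discounted cost E^n.
    grads.append(grad(w_disc, x, t))
    n = len(grads)
    w_disc -= eta * sum(alpha ** (n - mu) * g
                        for mu, g in enumerate(grads, start=1))

print(abs(w_mom - w_disc))    # approximately 0 (up to rounding): trajectories coincide
```

The loop above corresponds to pattern-by-pattern learning, where each $E_\mu$ is a single-pattern error; the same algebra carries over to batch mode with $E_\mu$ summed over all patterns.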
- A publication of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 1995-08-25
Authors
-
Hagiwara Masafumi
Faculty of Science and Technology, Keio University, Yokohama, Japan
-
Sato Akira
Kyoto Technology Center, Sekisui Chemical Co., Ltd.
Related Papers
- Parallel-Hierarchical Neural Network for 3D Object Recognition
- A Multi-Winner Associative Memory
- Quick Learning for Bidirectional Associative Memory (Special Issue on Neurocomputing)