Improving Convergence of Backpropagation Learning using Exponential Cost Function
Abstract
Backpropagation, one of the most popular learning algorithms for multi-layered feedforward neural networks, suffers from slow convergence. Several modifications have been proposed to accelerate the learning process using different techniques. In this paper, a new cost function, expressed as an exponential function of the sum-of-squares error or the log-likelihood cost, is proposed. Weight updates under this modification vary the learning rate parameter dynamically during training, as opposed to the constant learning rate parameter used in standard Backpropagation. Simulation results on different problems demonstrate significant improvement in the learning speed of the Backpropagation algorithm.
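The idea in the abstract can be sketched as follows. If the proposed cost is an exponential of the sum-of-squares error, E_exp = exp(λ·E_sse), then by the chain rule its gradient is λ·exp(λ·E_sse) times the ordinary sum-of-squares gradient, so the fixed learning rate η is effectively rescaled by an error-dependent factor that is large early in training and settles toward λ near convergence. The minimal single-layer example below is an illustrative sketch under these assumptions; the value of λ, the toy OR dataset, and the network size are not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
t = np.array([0.0, 1.0, 1.0, 1.0])   # toy OR problem (illustrative, not from the paper)
w = rng.normal(scale=0.1, size=2)
b = 0.0
eta, lam = 0.5, 1.0                  # base learning rate and exponent scale (assumed values)

for epoch in range(2000):
    o = sigmoid(X @ w + b)
    e = o - t
    E_sse = 0.5 * np.sum(e ** 2)
    # Gradient of E_exp = exp(lam * E_sse): the factor lam * exp(lam * E_sse)
    # is large while the error is large (boosting the effective learning rate)
    # and decays toward lam as training converges.
    scale = lam * np.exp(lam * E_sse)
    delta = e * o * (1 - o)          # standard sigmoid-output error term
    w -= eta * scale * (X.T @ delta)
    b -= eta * scale * np.sum(delta)
```

After training, the outputs `o` classify the OR patterns correctly when thresholded at 0.5; the only change relative to standard backpropagation is the multiplicative `scale` factor on the gradient.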
- A paper of the Institute of Electrical Engineers of Japan (IEEJ)
- 2003-05-01
Author
-
Kamruzzaman Joarder
Faculty Of Information Technology Monash University Australia
Related Papers
- Application of Support Vector Machine to Forex Monitoring
- Improving Convergence of Backpropagation Learning using Exponential Cost Function
- Arctangent Activation Function to Accelerate Backpropagation Learning