Combining Local Representative Networks to Improve Learning in Complex Nonlinear Learning Systems
Abstract
In a fully connected multilayer perceptron (MLP), all hidden units are activated by samples from the entire input space. For complex problems, interference and cross-coupling among the hidden units' activations force the network to use many hidden units to represent the problem, and the error surface becomes highly nonlinear. Searching for the minimum is then complex and computationally expensive, and simple gradient-descent algorithms usually fail. We propose a network in which the input space is partitioned into local sub-regions; a number of smaller networks are then trained simultaneously on overlapping subsets of the input samples. Various simulations show a remarkable improvement in both the training efficiency and the generalization performance of this combined network.
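The idea sketched in the abstract can be illustrated with a toy implementation: split a 1-D input space into overlapping sub-regions, train one small MLP per region, and blend the local predictions with soft weights. This is a minimal sketch under assumed hyperparameters (region boundaries, hidden size, learning rate, blending kernel); none of these details come from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(n_in, n_hidden):
    # One small tanh network with a single linear output unit.
    return {"W1": rng.normal(0, 0.5, (n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, 0.5, (n_hidden, 1)),
            "b2": np.zeros(1)}

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h @ net["W2"] + net["b2"], h

def train(net, X, y, lr=0.2, epochs=1000):
    # Plain batch gradient descent on 0.5 * mean squared error.
    for _ in range(epochs):
        out, h = forward(net, X)
        err = out - y
        gW2 = h.T @ err / len(X)
        gb2 = err.mean(0)
        dh = (err @ net["W2"].T) * (1 - h ** 2)   # backprop through tanh
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(0)
        net["W2"] -= lr * gW2; net["b2"] -= lr * gb2
        net["W1"] -= lr * gW1; net["b1"] -= lr * gb1
    return net

# Toy 1-D regression problem.
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X)

# Two overlapping sub-regions of the input space (overlap is [-0.2, 0.2]).
regions = [(-1.0, 0.2), (-0.2, 1.0)]
nets = []
for lo, hi in regions:
    mask = (X[:, 0] >= lo) & (X[:, 0] <= hi)
    nets.append(train(make_mlp(1, 20), X[mask], y[mask]))

def combined_predict(X):
    # Blend local networks with soft, distance-based region weights
    # (an illustrative choice; the paper's combination rule may differ).
    centers = np.array([(lo + hi) / 2 for lo, hi in regions])
    w = np.exp(-((X - centers) ** 2) / 0.2)       # shape (N, n_regions)
    w /= w.sum(1, keepdims=True)
    preds = np.hstack([forward(n, X)[0] for n in nets])
    return (w * preds).sum(1, keepdims=True)

mse = np.mean((combined_predict(X) - y) ** 2)
print(f"combined MSE: {mse:.4f}")
```

Each local network only has to fit the target over its own sub-region, so its error surface is simpler than that of one large network trained on the whole input space; the overlap between regions keeps the blended prediction smooth across region boundaries.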
- A paper published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 1997-09-25
Authors
-
CHAKRABORTY Goutam
University of Aizu
-
SAWADA Masayuki
University of Aizu
-
NOGUCHI Shoichi
University of Aizu