Power Law Slowdown of the Neural Learning
Abstract
We numerically show that the learning time t of the back-propagation model with the encoder topology obeys a power law, t ∝ M^D (D: constant, 1 < D ≲ 2), where M is the number of input patterns.
- Published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 1994-12-25
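
The sketch below is not the authors' code; it is a minimal illustration, under stated assumptions, of the kind of measurement the abstract describes: train a plain back-propagation network with the encoder topology (M input units, roughly log2(M) hidden units, M output units) on the M one-hot patterns, record the number of epochs t until every output is on the correct side of its target, and fit log t against log M to estimate the exponent D. The learning rate, stopping criterion, weight initialization, and hidden-layer size are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a learning-time-vs-pattern-count experiment for the
# encoder problem.  All hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def epochs_to_learn(M, eta=2.0, tol=0.4, max_epochs=500_000):
    """Epochs of batch back-propagation until max |output - target| < tol."""
    H = max(1, int(np.ceil(np.log2(M))))   # hidden layer: the encoder bottleneck
    X = np.eye(M)                          # the M one-hot input patterns
    W1 = rng.normal(0.0, 0.5, (M, H)); b1 = np.zeros(H)
    W2 = rng.normal(0.0, 0.5, (H, M)); b2 = np.zeros(M)
    for t in range(1, max_epochs + 1):
        h = sigmoid(X @ W1 + b1)           # hidden activations
        y = sigmoid(h @ W2 + b2)           # reconstructed patterns
        err = y - X                        # auto-encoding target is the input
        if np.max(np.abs(err)) < tol:
            return t                       # every output on the correct side
        dy = err * y * (1.0 - y)           # output-layer deltas
        dh = (dy @ W2.T) * h * (1.0 - h)   # back-propagated hidden deltas
        W2 -= eta * h.T @ dy / M; b2 -= eta * dy.mean(axis=0)
        W1 -= eta * X.T @ dh / M; b1 -= eta * dh.mean(axis=0)
    return max_epochs                      # did not converge (e.g. local minimum)

Ms = [4, 8, 16, 32]
ts = [epochs_to_learn(M) for M in Ms]
D, _ = np.polyfit(np.log(Ms), np.log(ts), 1)   # slope of the log-log fit
print("t(M):", dict(zip(Ms, ts)))
print(f"fitted exponent D ~ {D:.2f}")          # abstract reports 1 < D <~ 2
```

The max-error stopping criterion is used instead of a mean-squared-error threshold because the one-hot targets are mostly zeros: for large M an untrained network outputting all zeros would already pass a loose MSE cutoff, which would corrupt the measured learning times.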
Authors
-
Fuchikami Nobuko
Faculty of Science, Tokyo Metropolitan University
-
Nakajima Tatsuhiro
Faculty of Science, Tokyo Metropolitan University / Faculty of Economics, Meikai University
-
Cateau Hideyuki
Laboratory for Neural Modeling, Institute of Physical and Chemical Science Research (RIKEN)
-
Nunokawa Hiroshi
National Laboratory for High Energy Physics (KEK) / Japan Atomic Energy Research Institute, Ibaraki
Related Articles
- Generalized Cyclic Characters of Symmetric Groups
- MECHANISM OF LIPID LATERAL DIFFUSION IN LIPID BILAYER MEMBRANES
- On reduced Q-functions