Differential and Algebraic Geometry of Multilayer Perceptrons : Special Section on the 10th Anniversary of Trans. Fundamentals : Last Decade and 21st Century
Abstract
Information geometry is applied to the manifold of neural networks called multilayer perceptrons. It is important to study the total family of networks as a geometrical manifold, because learning is represented by a trajectory in such a space. The manifold of perceptrons has a rich differential-geometrical structure, represented by a Riemannian metric and singularities. An efficient learning method is proposed by using this structure. The parameter space of perceptrons includes many algebraic singularities, which affect the trajectories of learning. Such singularities are studied using simple models. This poses an interesting problem of statistical inference and learning in hierarchical models that include singularities.
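The "efficient learning method" referred to above is natural-gradient learning, in which the ordinary gradient is corrected by the Riemannian metric (the Fisher information matrix) of the parameter manifold. A minimal sketch of one such update step, assuming a toy quadratic loss and using the loss curvature matrix itself as the metric (an illustrative assumption, not the paper's implementation):

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1):
    """One natural-gradient update: theta <- theta - lr * G^{-1} grad,
    where G is the Riemannian metric (Fisher information matrix)."""
    return theta - lr * np.linalg.solve(fisher, grad)

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta with a badly scaled A.
# Using A as the metric G makes the update point straight at the optimum,
# illustrating how the metric corrects for curvature of the parameter space.
A = np.array([[100.0, 0.0], [0.0, 1.0]])
theta = np.array([1.0, 1.0])
grad = A @ theta  # ordinary gradient of the quadratic loss
theta_ng = natural_gradient_step(theta, grad, fisher=A, lr=1.0)
print(theta_ng)  # with lr=1 the natural step lands exactly at the optimum [0, 0]
```

Plain gradient descent on such an ill-conditioned loss zig-zags along the steep axis; the metric-corrected step does not, which is the practical benefit the abstract alludes to.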
- Published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2001-01-01
Authors
-
Amari Shun-ichi
RIKEN Brain Science Institute, Saitama, Japan
-
Ozeki Tomoko
RIKEN Brain Science Institute, Saitama, Japan
Related Papers
- Single-Trial Magnetoencephalographic Data Decomposition and Localization Based on Independent Component Analysis Approach
- Neural Network Models for Blind Separation of Time Delayed and Convolved Signals
- Equivariant nonstationary source separation
- Natural Gradient Learning for Spatio-Temporal Decorrelation: Recurrent Network