Principal Component Analysis by Entropy-Likelihood Optimization
Abstract
This paper proposes a principal component analysis (PCA) criterion whose optimization yields the principal eigenvectors of the data correlation matrix as well as the associated eigenvalues. The corresponding learning algorithms are derived for the unsupervised training of one-layer linear neural networks. The part of the algorithm that estimates the principal eigenvectors turns out to be a version of Sanger's generalized Hebbian algorithm (GHA) that enjoys adaptive learning rates and fast convergence. The proposed criterion differs from standard PCA criteria, such as Maximum Variance and Minimum MSE, in that (a) optimization of the standard criteria yields only the principal eigenvectors, not the eigenvalues, and (b) their corresponding learning algorithm, namely the GHA, has a fixed learning rate. Simulation results illustrate the fast convergence of the derived algorithm.
- A paper of the Information Processing Society of Japan (IPSJ)
- 1999-10-15
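The abstract contrasts the proposed adaptive-rate algorithm with the standard GHA baseline. As a point of reference, the classical fixed-rate GHA (Sanger's rule) that the paper improves upon can be sketched as below; the entropy-likelihood criterion and the adaptive learning rates of the paper itself are not reproduced here, and the synthetic data, step count, and learning rate are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with anisotropic covariance (illustrative choice,
# not from the paper): variance ~9 along e1, ~1 along e2.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
X = rng.standard_normal((5000, 2)) @ A

def gha_step(W, x, eta):
    """One fixed-rate Sanger's-rule (GHA) update.

    Rows of W estimate the leading eigenvectors of the data
    correlation matrix E[x x^T]; np.tril implements the
    Gram-Schmidt-like deflation across output units.
    """
    y = W @ x
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

W = 0.1 * rng.standard_normal((2, 2))
for x in X:
    W = gha_step(W, x, eta=0.005)

# After training, the first row of W should align with the dominant
# eigenvector (here ±e1) and have roughly unit norm.
print(np.abs(W[0]) / np.linalg.norm(W[0]))
```

With a fixed learning rate the estimate only fluctuates around the eigenvector; the paper's contribution is precisely to replace this fixed `eta` with adaptive, per-unit rates derived from the entropy-likelihood criterion, which also yields the eigenvalue estimates.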
Authors
-
Peper Ferdinand
Telecommunications Research Laboratory
-
SHIRAZI N.
Telecommunications Research Laboratory
-
SAWAI HIDEFUMI
Telecommunications Research Laboratory