Gradient Descent Learning of Rotor Associative Memory
Abstract
Complex-valued Associative Memory (CAM) is an extended model of Hopfield Associative Memory (HAM). The fundamental elements of the CAM, such as the input-output signals and connection weights, are extended to complex numbers, which allows the CAM to handle multistate information. Rotor Associative Memory (RAM) is an extension of the CAM. Rotor neurons are essentially equivalent to complex-valued neurons, but the connection weights of the RAM are expressed by two-by-two matrices. Only the Hebb rule has been proposed for learning in the RAM; its storage capacity is small, so more advanced learning methods are necessary. In this paper, we propose a gradient descent learning rule for the RAM (GDR RAM), based on the rule for the CAM (GDR CAM) proposed by Lee. We derived the learning rule and performed computer simulations to compare the GDR CAM and the GDR RAM. The results show that the storage capacity of the GDR RAM is approximately twice that of the GDR CAM, and that the noise robustness of the GDR RAM is much better than that of the GDR CAM.
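As a concrete illustration of the model described in the abstract, the following is a minimal NumPy sketch of a rotor associative memory: neuron states are unit vectors in R^2, each connection weight is a 2x2 matrix, storage uses the Hebb rule, and a simple squared-error gradient step stands in for a gradient descent learning rule. The function names and the particular error function are assumptions made for illustration, not the formulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def random_rotor_patterns(num_patterns, num_neurons):
    """Random rotor patterns: each neuron state is a unit vector in R^2."""
    angles = rng.uniform(0.0, 2.0 * np.pi, size=(num_patterns, num_neurons))
    return np.stack([np.cos(angles), np.sin(angles)], axis=-1)  # shape (P, N, 2)


def hebbian_weights(patterns):
    """Hebb rule: each 2x2 weight block W_ij is the outer product x_i x_j^T,
    summed over the stored patterns and scaled by 1/N (self-connections zeroed)."""
    num_patterns, num_neurons, _ = patterns.shape
    W = np.einsum('pia,pjb->ijab', patterns, patterns) / num_neurons
    idx = np.arange(num_neurons)
    W[idx, idx] = 0.0
    return W  # shape (N, N, 2, 2)


def recall(W, state, steps=10):
    """Synchronous recall: local field h_i = sum_j W_ij x_j, then renormalize
    each neuron state back onto the unit circle."""
    x = state.copy()
    for _ in range(steps):
        h = np.einsum('ijab,jb->ia', W, x)
        norm = np.linalg.norm(h, axis=-1, keepdims=True)
        x = np.where(norm > 1e-12, h / np.maximum(norm, 1e-12), x)
    return x


def gradient_descent_step(W, patterns, lr=0.05):
    """One gradient step on the mean squared error between each stored pattern
    and its local field (an illustrative stand-in for the GDR RAM rule)."""
    num_patterns, num_neurons, _ = patterns.shape
    h = np.einsum('ijab,pjb->pia', W, patterns)        # local fields, shape (P, N, 2)
    err = patterns - h
    grad = -np.einsum('pia,pjb->ijab', err, patterns) / num_patterns
    W = W - lr * grad
    idx = np.arange(num_neurons)
    W[idx, idx] = 0.0
    return W


# Store a few random rotor patterns, refine the weights, then test recall.
patterns = random_rotor_patterns(num_patterns=5, num_neurons=30)
W = hebbian_weights(patterns)
for _ in range(200):
    W = gradient_descent_step(W, patterns)
recovered = recall(W, patterns[0])
print(np.mean(np.sum(recovered * patterns[0], axis=-1)))  # mean cosine overlap
```

A storage-capacity or noise-robustness experiment of the kind compared in the paper would repeat this loop while increasing the number of stored patterns, or while flipping the stored states by random angles before recall, and record how often the memory converges back to the correct pattern.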
Authors
Related Papers
- Reduction of Spurious Memories in Rotor Associative Memory (Special Issue: Bioelectronics and Biological Information Engineering)
- Gradient Descent Learning of Rotor Associative Memory
- Reduction of Spurious Memories in Rotor Associative Memory
- Three-Dimensional Associative Memory Using the Cross Product