Memory Superimposition by Backpropagation Neural Networks
Abstract
- We propose a novel neural network for incremental learning tasks, in which networks are required to learn new knowledge without forgetting the old. The essential core of the proposed neural learning structure is a scheme for transferring short-term memory (STM) into long-term memory (LTM), as in the brain, by using dynamically changing weights. As the number of LTMs increases, a new network structure is superimposed on the previous one without disturbing past LTMs, by introducing a lateral inhibition mechanism. The superiority of the proposed neural structure over conventional backpropagation networks is demonstrated with respect to learning ability.
- Keywords: Neural networks, incremental learning, pattern classification, long-term memory
- Article from 東北大学医療技術短期大学部
- 2003-07-31
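The STM-to-LTM transfer with superimposed, non-interfering memories described in the abstract can be illustrated with a toy model. The sketch below is not the paper's backpropagation network: it uses a simple Hebbian associative memory with orthogonal keys as a stand-in for the lateral-inhibition mechanism, and the `SuperimposedMemory` class and its method names are assumptions made here for illustration only.

```python
import numpy as np

class SuperimposedMemory:
    """Toy STM -> LTM consolidation (illustrative only, not the paper's model).

    Recent pattern pairs sit in a short-term buffer; consolidation
    superimposes them onto the long-term weight matrix W as Hebbian
    outer products.  With orthogonal keys the superposition is
    interference-free, so old memories survive new learning."""

    def __init__(self, dim):
        self.W = np.zeros((dim, dim))   # long-term memory (LTM)
        self.stm = []                   # short-term buffer of (key, value)

    def learn(self, key, value):
        # new knowledge first enters short-term memory
        self.stm.append((np.asarray(key, float), np.asarray(value, float)))

    def consolidate(self):
        # transfer STM into LTM by superimposing Hebbian outer products
        for k, v in self.stm:
            self.W += np.outer(v, k)
        self.stm = []

    def recall(self, key):
        return self.W @ np.asarray(key, float)

# Usage: with orthogonal keys, consolidating a new pair
# does not disturb the previously stored one.
m = SuperimposedMemory(4)
k1, v1 = np.array([1., 0., 0., 0.]), np.array([0., 1., 0., 0.])
k2, v2 = np.array([0., 1., 0., 0.]), np.array([0., 0., 1., 0.])
m.learn(k1, v1); m.consolidate()     # first long-term memory
m.learn(k2, v2); m.consolidate()     # superimpose a second one
print(np.allclose(m.recall(k1), v1))  # old memory intact -> True
print(np.allclose(m.recall(k2), v2))  # new memory recalled -> True
```

The orthogonality of the keys plays the role that lateral inhibition plays in the paper: it keeps each new superimposed memory from overwriting the ones already stored.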
東北大学医療技術短期大学部 | Articles
- Effects of Interferon-α and -γ on Diaphragm Muscle in Rats
- Core-Orthogonalization Effects on the Momentum Density Distribution and the Compton Profile of Valence Electrons in Semiconductors
- Linearity in Compton Scattering B(γ)Function of Semiconductors to Ionicity in Valence Electronic Bond
- Study of Valence Electronic Bond Character through Kinematical Structure Parameters in Compton Scattering B(γ)-Function of Semiconductors
- Properties of Nonlocal Pseudopotentials of Si and Ge Optimized under Full Interdependence among Potential Parameters