Stability Analysis of Attractor Neural Network Model of Inferior Temporal Cortex —Relationship between Attractor Stability and Learning Order—
Abstract
Miyashita found that the long-term memory of visual stimuli is stored in the monkey inferior temporal cortex, and that the temporal correlation given by the learning order of the visual stimuli is converted into a spatial correlation among the firing-rate patterns of the neuron group. To explain these findings, Griniasty et al. [Neural Comput. 5 (1993) 1] and Amit et al. [J. Neurosci. 14 (1994) 6435] proposed attractor neural network models; however, the Amit model has been examined only for the stable states obtained when the memory patterns are stored in a fixed sequence. In the real world, the learning order has statistical continuity but also contains randomness, and when memory patterns are stored in a random order, the stability of the state changes with the statistical properties of that order. Moreover, it is desirable that the stable state be an appropriate attractor that reflects, through the statistical properties of the learning order, the relationship between the memory patterns. In this study, we examine how the stable state depends on the statistical properties of the learning order, without modifying the Amit model. We find that as the rate of random learning increases, the stable state changes from the correlated attractor to the Hopfield attractor or to the Mp attractor, the mixed state composed of all memory patterns. Furthermore, we find that when the statistical properties of the learning order change, the stable state can change into an appropriate attractor reflecting the relationship between the memory patterns.
- Paper published by the Physical Society of Japan
- 2010-06-15
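For readers unfamiliar with this model class: the network described in the abstract stores memory patterns in Hebbian couplings that also contain cross terms between patterns learned in succession, so the learning order is imprinted on the attractor structure. The sketch below is not the authors' code; it is a minimal Python toy of a simplified ±1 variant of such a Griniasty/Amit-type network, in which the network size N, pattern count P, cross-term strength a, number of presentations T, and the rate of random learning r are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500    # number of neurons (illustrative)
P = 10     # number of memory patterns (illustrative)
a = 0.5    # strength of cross terms between successively learned patterns (assumed)
r = 0.0    # rate of random learning: 0 = fixed cyclic sequence, 1 = fully random order
T = 200    # number of pattern presentations (assumed)

# Random +/-1 memory patterns; the original Amit model uses sparse 0/1 patterns,
# so this is a simplified variant for illustration only.
xi = rng.choice([-1, 1], size=(P, N))

# Learning order: follow the cyclic sequence 0,1,...,P-1,0,... and, with
# probability r, jump to a randomly chosen pattern instead.
order = np.empty(T, dtype=int)
order[0] = 0
for t in range(1, T):
    if rng.random() < r:
        order[t] = rng.integers(P)
    else:
        order[t] = (order[t - 1] + 1) % P

# Hebbian couplings with cross terms between successively presented patterns.
J = np.zeros((N, N))
for t in range(T):
    mu = order[t]
    J += np.outer(xi[mu], xi[mu])
    if t > 0:
        nu = order[t - 1]
        J += a * (np.outer(xi[mu], xi[nu]) + np.outer(xi[nu], xi[mu]))
J /= N * (T / P)          # each pattern is presented about T/P times
np.fill_diagonal(J, 0.0)

# Synchronous zero-temperature dynamics, started from one stored pattern.
S = xi[0].copy()
for _ in range(30):
    S = np.sign(J @ S)
    S[S == 0] = 1

# Overlaps of the final state with every stored pattern.
m = xi @ S / N
print(np.round(m, 2))
```

With r = 0 this toy should settle into a correlated attractor, i.e., the printed overlaps decay with distance from pattern 0 along the learning sequence; increasing r toward 1 is expected to drive the fixed point toward a Hopfield-like or mixed state, which is the qualitative behavior the abstract describes for the full model.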
Authors
-
Masato Okada
Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba 277-8561, Japan
-
Tomoyuki Kimoto
Oita National College of Technology, Oita 870-0152, Japan
-
Tatsuya Uezu
Graduate School of Sciences and Humanities, Nara Women's University, Nara 630-8506, Japan
Related papers
- Statistical Mechanical Study of Code-Division Multiple-Access Multiuser Detectors : Analysis of Replica Symmetric and One-Step Replica Symmetry Breaking Solutions(General)
- Statistical Mechanical Analysis of CDMA Multiuser Detectors : AT Stability and Entropy of the RS Solution, and 1RSB Solution
- Retrieval Property of Attractor Network with Synaptic Depression(General)
- Analysis of Ensemble Learning Using Simple Perceptrons Based on Online Learning Theory
- Retrieval of Branching Sequences in an Associative Memory Model with Common External Input and Bias Input(Cross-disciplinary physics and related areas of science and technology)
- Residual Energies after Slow Quantum Annealing(General)
- On-line Learning of Perceptron from Noisy Data by One and Two Teachers(General)
- Multiple Stability of a Sparsely Encoded Attractor Neural Network Model for the Inferior Temporal Cortex(General)
- Distinction of Coexistent Attractors in an Attractor Neural Network Model Using a Relaxation Process of Fluctuations in Firing Rates : Analysis with Statistical Mechanics(General)
- Dynamical Behavior of Phase Oscillator Networks on the Bethe Lattice(General)
- Theory of Time Domain Ensemble On-Line Learning of Perceptron under the Existence of External Noise(General)
- Statistical Mechanics of Time-Domain Ensemble Learning(General)
- Image Restoration with a Truncated Gaussian Model(General)
- Statistical Mechanics of On-line Node-perturbation Learning
- Response to Invasion by Antigens and Effects of Threshold in an Immune Network Dynamical System Model with a Small Number of Degrees of Freedom
- Analysis of an Immune Network Dynamical System Model with a Small Number of Degrees of Freedom
- Mixed states on neural network with structural learning
- Sparsely Encoded Associative Memory Model with Forgetting Process(Regular Section)
- Ensemble Learning of Linear Perceptrons : On-Line Learning Theory(General)
- A Large Scale Dynamical System Immune Network Model with Finite Connectivity(Oscillation, Chaos and Network Dynamics in Nonlinear Science)
- Analysis of XY Model with Mexican-Hat Interaction on a Circle
- Exact Inference in Discontinuous Firing Rate Estimation Using Belief Propagation
- Influence of Synaptic Depression on Memory Storage Capacity
- Neural Network Model of Spatial Memory: Associative Recall of Maps
- Statistical Mechanics of On-Line Mutual Learning with Many Linear Perceptrons
- Statistical Mechanics of Mutual Learning with a Latent Teacher(General)
- Mean Field Analysis of Stochastic Neural Network Models with Synaptic Depression
- On the Conditions for the Existence of Perfect Learning and Power Law Behaviour in Learning from Stochastic Examples by Ising Perceptrons
- Learning of Non-Monotonic Target Functions by Ising Perceptrons : Learning Curve, Perfect Learning and Perfect Anti-Learning (General)
- Rate Reduction for Associative Memory Model in Hodgkin-Huxley-Type Network(Cross-disciplinary physics and related areas of science and technology)
- Higher Order Effects on Rate Reduction for Networks of Hodgkin-Huxley Neurons(Cross-disciplinary physics and related areas of science and technology)
- Online Learning of Perceptron from Noisy Data: A Case in which Both Student and Teacher Suffer from External Noise
- Estimation of Intracellular Calcium Ion Concentration by Nonlinear State Space Modeling and Expectation-Maximization Algorithm for Parameter Estimation
- Neural Network Model with Discrete and Continuous Information Representation
- Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model (General)
- Estimating Membrane Resistance over Dendrite Using Markov Random Field
- Stability Analysis of Stochastic Neural Network with Depression and Facilitation Synapses
- Analysis of XY Model with Mexican-Hat Interaction on a Circle ---Derivation of Saddle Point Equations and Study of Bifurcation Structure---
- Solvable Model of a Phase Oscillator Network on a Circle with Infinite-Range Mexican-Hat-Type Interaction
- Image Segmentation and Restoration Using Switching State-Space Model and Variational Bayesian Method
- Bayesian Image Restoration for Medical Images Using Radon Transform
- Statistical Mechanics of Mexican-Hat-Type Horizontal Connection
- Statistical Mechanics of Node-Perturbation Learning for Nonlinear Perceptron
- Learning from Stochastic Rules by Spherical Perceptrons under Finite Temperature ---Optimal Temperature and Asymptotic Learning Curve---
- Inter-Layer Correlation in a Feed-Forward Network with Intra-Layer Common Noise
- Continuous Attractor that Appears in Autoassociative Memory Model Extended to XY Spin System
- Statistical–Mechanical Analysis of Attractor Dynamics in a Hysteretic Neuron Network
- A Numerical Analysis of Learning Coefficient in Radial Basis Function Network
- Superconvergence of Period-Doubling Cascade in Trapezoid Maps : Its Rigorous Proof and Superconvergence of the Period-Doubling Cascade Starting from a Period p Solution