A Recurrent Log-Linearized Gaussian Mixture Network
Overview
Context in time series is one of the most useful and interesting characteristics for machine learning. In some cases, the dynamic characteristic may be the only basis for achieving a possible classification. A novel neural network, named the "recurrent log-linearized Gaussian mixture network (R-LLGMN)," is proposed in this paper for the classification of time series. The structure of this network is based on a hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network using a log-linearized Gaussian mixture model, in which recurrent connections have been incorporated to make use of temporal information. Simulation experiments are carried out to compare R-LLGMN with the traditional HMM estimator as a classifier, and finally, pattern classification experiments on EEG signals are conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
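The abstract describes an HMM-like recurrent structure built on a log-linearized Gaussian mixture model: inputs are expanded into a nonlinear (log-linear) basis, weighted sums are pushed through a softmax to yield mixture/state probabilities, and forward activations are accumulated recurrently over the sequence, much like the HMM forward algorithm. The following is only a minimal NumPy sketch of that idea, not the paper's exact formulation; the function names, weight layout `(C, K, K, M, H)`, and normalization choices are illustrative assumptions.

```python
import numpy as np

def nonlinear_expand(x):
    """Log-linearization basis (assumed form): 1, x_i, and pairwise
    products x_i * x_j (i <= j), as in a log-linearized Gaussian mixture."""
    terms = [1.0]
    terms.extend(x)
    d = len(x)
    for i in range(d):
        for j in range(i, d):
            terms.append(x[i] * x[j])
    return np.array(terms)

def rllgmn_forward(x_seq, W):
    """Sketch of an HMM-like recurrent forward pass.

    x_seq: sequence of input vectors.
    W: weights of shape (C, K, K, M, H) for C classes, K states,
       M mixture components, and H expanded input dimensions.
    Returns class posteriors of shape (C,).
    """
    C, K, _, M, H = W.shape
    alpha = np.ones((C, K)) / (C * K)          # initial forward activations
    for x in x_seq:
        phi = nonlinear_expand(x)              # shape (H,)
        scores = W @ phi                       # shape (C, K, K, M)
        b = np.exp(scores - scores.max())      # softmax over all units
        b /= b.sum()
        gamma = b.sum(axis=3)                  # marginalize components -> (C, K', K)
        alpha = np.einsum('ck,ckj->cj', alpha, gamma)  # recurrent accumulation
        alpha /= alpha.sum()                   # renormalize for stability
    return alpha.sum(axis=1)                   # per-class probabilities
```

For a two-class problem with two-dimensional inputs, `H = 1 + 2 + 3 = 6`, so `W` would have shape `(2, K, K, M, 6)`; the output always sums to one, so the maximizing class index gives the predicted label.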
- IEEE paper
- 2003