Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
Abstract
Kernel logistic regression (KLR) is a powerful and flexible classification algorithm that can also provide the confidence of its class predictions. However, its training, typically carried out by (quasi-)Newton methods, is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by a log-linear combination of kernel functions, and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically, simply by solving a regularized system of linear equations in a class-wise manner. LSPC is therefore computationally very efficient and numerically stable. Through experiments, we show that the computation time of LSPC is two orders of magnitude shorter than that of KLR, with comparable classification accuracy.
- A paper published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2010-10-01
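
The abstract above describes the core of LSPC: the class-posterior probability is modeled by a linear combination of kernel functions, and the parameters for each class are obtained analytically by solving one regularized system of linear equations. The NumPy sketch below illustrates that class-wise least-squares solution under assumptions not stated in the abstract: a Gaussian kernel with all training points used as centers, illustrative values for the kernel width `sigma` and the regularization parameter `lam`, and a final step that clips negative model outputs to zero and normalizes them over classes. It is a minimal sketch, not the authors' reference implementation.

```python
import numpy as np

def gaussian_kernel(X, C, sigma):
    """Gaussian kernel matrix between samples X (n x d) and centers C (b x d)."""
    sq_dist = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(C ** 2, axis=1)[None, :]
        - 2.0 * X @ C.T
    )
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def lspc_fit(X, y, sigma=1.0, lam=0.1):
    """Fit one regularized least-squares problem per class (illustrative sketch)."""
    classes = np.unique(y)
    K = gaussian_kernel(X, X, sigma)            # all training points as kernel centers
    n, b = K.shape
    H = K.T @ K / n + lam * np.eye(b)           # regularized matrix shared by all classes
    # Right-hand side for class c: sum of kernel vectors over class-c samples, divided by n.
    h = np.stack([K[y == c].sum(axis=0) / n for c in classes], axis=1)  # shape (b, n_classes)
    Theta = np.linalg.solve(H, h)               # analytic, class-wise solutions as columns
    return {"centers": X, "classes": classes, "Theta": Theta, "sigma": sigma}

def lspc_predict_proba(model, X_test):
    """Posterior estimates: clip negative outputs to zero, then normalize over classes."""
    K = gaussian_kernel(X_test, model["centers"], model["sigma"])
    scores = np.maximum(K @ model["Theta"], 0.0)
    total = scores.sum(axis=1, keepdims=True)
    total[total == 0.0] = 1.0                   # guard against all-zero rows
    return scores / total

# Toy usage: two Gaussian blobs, hypothetical hyperparameter values.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, size=(50, 2)), rng.normal(+1, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = lspc_fit(X, y, sigma=1.0, lam=0.1)
proba = lspc_predict_proba(model, X)
print("training accuracy:", np.mean(model["classes"][proba.argmax(axis=1)] == y))
```

Because the matrix H is shared across classes, the per-class solutions differ only in the right-hand side h, which is what makes the class-wise training inexpensive compared with iterative (quasi-)Newton optimization of KLR.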
Authors
Related papers
- Statistical active learning for efficient value function approximation in reinforcement learning (Neurocomputing)
- Lighting Condition Adaptation for Perceived Age Estimation
- Computationally Efficient Multi-task Learning with Least-squares Probabilistic Classifiers
- A Unified Framework of Density Ratio Estimation under Bregman Divergence
- Adaptive importance sampling with automatic model selection in value function approximation (Neurocomputing)
- Improving Model-based Reinforcement Learning with Multitask Learning
- Least-Squares Conditional Density Estimation
- Direct Importance Estimation with a Mixture of Probabilistic Principal Component Analyzers
- Statistical Analysis of Kernel Density-Ratio Estimation (Analysis of Learning Problems, Text and Web Mining, General)
- A Semi-Supervised Approach to Perceived Age Prediction from Face Images
- Conditional Density Estimation Based on Density Ratio Estimation
- A density ratio approach to two-sample test (Pattern Recognition and Media Understanding)
- A density ratio approach to two-sample test (Information-Theoretic Learning Theory and Machine Learning)
- Theoretical Analysis of Density Ratio Estimation
- FOREWORD
- Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
- Direct Importance Estimation with Gaussian Mixture Models
- Improving the Accuracy of Least-Squares Probabilistic Classifiers
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Pattern Recognition and Media Understanding)
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Information-Theoretic Learning Theory and Machine Learning)
- Least-Squares Independence Test
- Density Difference Estimation
- Winning the Kaggle Algorithmic Trading Challenge with the Composition of Many Models and Feature Engineering
- Artist Agent: A Reinforcement Learning Approach to Automatic Stroke Generation in Oriental Ink Painting
- Early stopping Heuristics in Pool-Based Incremental Active Learning for Least-Squares Probabilistic Classifier
- Computationally Efficient Multi-Label Classification by Least-Squares Probabilistic Classifiers
- Multi-Task Approach to Reinforcement Learning for Factored-State Markov Decision Problems
- Constrained Least-Squares Density-Difference Estimation
- A Density-ratio Framework for Statistical Data Processing
- Model-Based Policy Gradients with Parameter-Based Exploration by Least-Squares Conditional Density Estimation
- FOREWORD
- On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion