Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability (Artificial Intelligence and Cognitive Science)
Abstract
Our purpose is to estimate the conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that yield shrinkage estimators. The effect of regularization is realized by shrinking the estimated probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions give significantly better results than existing boosting algorithms for the estimation of conditional probabilities.
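A minimal sketch of the shrinkage idea summarized above, assuming a softmax link from boosted scores to class probabilities and a hypothetical shrinkage weight `lam`; the paper's actual loss functions and boosting updates are not reproduced here.

```python
import numpy as np

def class_probabilities(scores):
    """Map boosted scores F_k(x) to class probabilities via a softmax link.

    `scores` has shape (n_samples, n_classes). The softmax link is an
    assumption for illustration only; the paper derives the probability
    estimates from its own proposed loss functions.
    """
    z = scores - scores.max(axis=1, keepdims=True)  # subtract max for numerical stability
    expz = np.exp(z)
    return expz / expz.sum(axis=1, keepdims=True)

def shrink_toward_uniform(probs, lam=0.1):
    """Shrink estimated class probabilities toward the uniform distribution.

    `lam` in [0, 1] is a hypothetical shrinkage weight controlling the amount
    of regularization: lam = 0 keeps the raw (possibly overconfident)
    estimates, lam = 1 returns the uniform distribution.
    """
    n_classes = probs.shape[1]
    return (1.0 - lam) * probs + lam / n_classes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scores = rng.normal(size=(5, 3))       # toy boosted scores for 3 classes
    raw = class_probabilities(scores)      # raw estimates tend to overfit
    shrunk = shrink_toward_uniform(raw, lam=0.2)
    print(raw.round(3))
    print(shrunk.round(3))
```

The shrunk estimates stay closer to 1/K, which is the regularization effect the abstract describes; in the paper this effect comes from the proposed loss functions rather than an explicit post-hoc mixing step.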
- Published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2007-12-01
Authors
Related Papers
- A Unified Framework of Density Ratio Estimation under Bregman Divergence
- Least-Squares Conditional Density Estimation
- Statistical Analysis of Kernel Density-Ratio Estimation (Analysis of Learning Problems, Text and Web Mining, General)
- Conditional Density Estimation Based on Density Ratio Estimation
- A Density Ratio Approach to Two-Sample Tests (Pattern Recognition and Media Understanding)
- A Density Ratio Approach to Two-Sample Tests (Information-Theoretic Learning Theory and Machine Learning)
- Theoretical Analysis of Density Ratio Estimation
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability (Artificial Intelligence and Cognitive Science)
- Multiscale Bagging and Its Applications
- Density Difference Estimation
- Constrained Least-Squares Density-Difference Estimation
- A Density-ratio Framework for Statistical Data Processing