Boosting Learning Algorithm for Pattern Recognition and Beyond
Abstract
This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties such as Bayes risk consistency for several loss functions are discussed in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets. A unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization, including AdaBoost and LogitBoost as a twin generated from the Kullback-Leibler divergence, and boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.
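The loss-minimization view of boosting described above can be illustrated with a minimal AdaBoost sketch, the exponential-loss member of the family the abstract discusses. The decision-stump base learner, the toy 1-D interval dataset, and the round count below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # Decision stump: sign * (+1 if x[feat] > thresh else -1).
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    # Exhaustive search for the stump with minimum weighted error.
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                err = np.sum(w[stump_predict(X, feat, thresh, sign) != y])
                if best is None or err < best[0]:
                    best = (err, feat, thresh, sign)
    return best

def adaboost(X, y, rounds):
    # AdaBoost as stagewise minimization of the exponential loss
    # sum_i exp(-y_i * F(x_i)) over the ensemble score F.
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, feat, thresh, sign = fit_stump(X, y, w)
        err = max(err, 1e-12)                    # guard against log(0)
        alpha = 0.5 * np.log((1.0 - err) / err)  # optimal step for exp-loss
        pred = stump_predict(X, feat, thresh, sign)
        w = w * np.exp(-alpha * y * pred)        # up-weight misclassified points
        w = w / w.sum()
        ensemble.append((alpha, feat, thresh, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, f, t, s) for a, f, t, s in ensemble)
    return np.sign(score)

# Toy 1-D problem: label +1 on the interval [3, 6], -1 outside.
# No single stump fits this, but a weighted sum of three does.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.where((X[:, 0] >= 3) & (X[:, 0] <= 6), 1.0, -1.0)
ensemble = adaboost(X, y, rounds=3)
acc = float(np.mean(predict(ensemble, X) == y))
```

Swapping the exponential loss for the binomial log-likelihood in the `alpha` and reweighting steps yields the LogitBoost side of the Kullback-Leibler "twin" the abstract mentions.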
- 2011-10-01
Authors
-
EGUCHI Shinto
The Institute of Statistical Mathematics
-
KOMORI Osamu
The Institute of Statistical Mathematics
Related Papers
- Identifying haplotype block structure using an ancestor-derived model