Improving the Accuracy of Least-Squares Probabilistic Classifiers
Abstract
The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression. However, to ensure that its learned probabilities are non-negative, LSPC involves a post-processing step that rounds negative parameters up to zero, which can unexpectedly degrade classification performance. To mitigate this problem, we propose a simple alternative scheme that directly rounds up the classifier's negative outputs rather than its negative parameters. Through extensive experiments, including real-world image classification and audio tagging tasks, we demonstrate that the proposed modification significantly improves classification accuracy while preserving the computational advantage of the original LSPC.
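The contrast between the two rounding schemes can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the authors' implementation: LSPC is sketched here as per-class ridge regression on Gaussian kernel features, with the proposed post-processing applied to the outputs (all function names, the kernel choice, and the hyperparameters `lam` and `sigma` are hypothetical).

```python
import numpy as np

def gaussian_kernel(X, C, sigma=1.0):
    # Pairwise Gaussian kernel values between samples X and centers C.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def lspc_fit(X, y, n_classes, lam=0.1, sigma=1.0):
    # One regularized least-squares parameter vector per class,
    # fitted against the one-hot class indicators.
    K = gaussian_kernel(X, X, sigma)
    A = K.T @ K + lam * np.eye(K.shape[1])
    Theta = np.linalg.solve(A, K.T @ np.eye(n_classes)[y])
    return Theta  # shape: (n_train, n_classes)

def lspc_predict_proba(X_test, X_train, Theta, sigma=1.0):
    # Proposed scheme: round negative *outputs* (not parameters)
    # up to zero, then normalize across classes.
    K = gaussian_kernel(X_test, X_train, sigma)
    q = np.maximum(K @ Theta, 0.0)
    return q / np.maximum(q.sum(axis=1, keepdims=True), 1e-12)
```

The original LSPC would instead clip `Theta` itself (`np.maximum(Theta, 0.0)`) before computing the outputs; deferring the clipping to the outputs keeps the unconstrained least-squares solution intact, which is the modification the abstract advocates.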