Approximate Reduction from AUC Maximization to 1-norm Soft Margin Optimization (情報論的学習理論と機械学習)
Abstract
Finding linear classifiers that maximize AUC scores is important in ranking research. This is naturally formulated as a 1-norm hard/soft margin optimization problem over the pn pairs formed by p positive and n negative instances. However, solving these optimization problems directly is impractical, since the problem size (pn) grows quadratically in the given sample size (p+n). In this paper, we give (approximate) reductions from these problems to hard/soft margin optimization problems of linear size. First, for the hard margin case, we show that the problem reduces to a hard margin optimization problem over p+n instances in which the bias constant term is to be optimized. Then, for the soft margin case, we show that the problem approximately reduces to a soft margin optimization problem over p+n instances, for which the resulting linear classifier is guaranteed to have a certain margin over pairs.
- 2011-11-02
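The pairwise formulation the abstract starts from can be made concrete: the AUC of a linear classifier is the fraction of the pn positive-negative pairs that it ranks correctly (with ties counted as 1/2), which is why the naive margin formulation has one constraint per pair. A minimal sketch of this pairwise AUC computation, with an illustrative weight vector and made-up instances (not taken from the paper):

```python
# Sketch: the AUC of a linear scorer w equals the fraction of
# (positive, negative) pairs where the positive instance scores higher,
# counting ties as 1/2. Enumerating all p*n pairs makes the quadratic
# problem size explicit. Data below is illustrative only.

def auc(w, positives, negatives):
    """AUC of the linear scorer x -> w . x over all p*n pairs."""
    score = lambda x: sum(wi * xi for wi, xi in zip(w, x))
    correct = sum(
        1.0 if score(xp) > score(xn) else 0.5 if score(xp) == score(xn) else 0.0
        for xp in positives
        for xn in negatives
    )
    return correct / (len(positives) * len(negatives))

w = [1.0, -0.5]                               # hypothetical weight vector
pos = [[2.0, 0.0], [1.5, 1.0]]                # p = 2 positive instances
neg = [[0.0, 1.0], [2.0, 2.0], [0.5, 0.5]]    # n = 3 negative instances
print(auc(w, pos, neg))  # 5.5 correct pairs out of 6
```

With p = 2 and n = 3 this enumerates 6 pairs; at realistic sample sizes the pn constraints are exactly what makes the direct optimization impractical, motivating the reductions to p+n instances described above.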
Authors
- Takimoto Eiji (Department of Informatics, Kyushu University)
- Suehiro Daiki (Department of Informatics, Kyushu University)
- Hatano Kohei (Department of Informatics, Kyushu University)
Related papers
- Lower Bounds on Quantum Query Complexity for Read-Once Formulas with XOR and MUX Operators
- NPN-Representatives of a Set of Optimal Boolean Formulas
- Adaptive Online Prediction Using Weighted Windows
- Approximate Reduction from AUC Maximization to 1-norm Soft Margin Optimization