On the Sample Complexity of Consistent Learning with One-Sided Error
Abstract
Although consistency is sufficient for PAC-learning, it is not known which strategies make learning more efficient, particularly with respect to sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that admit consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated, and an O((d/ε) + (1/ε)log(1/δ)) upper bound on the sample complexity of the learning algorithm for the class is obtained. This bound improves the bounds due to Blumer et al. [2] and matches the lower bound to within a constant factor.
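As an illustrative sketch (not code from the paper), the standard consistent learner with one-sided error for axis-parallel rectangles is the tightest-fit hypothesis: output the smallest rectangle enclosing all positive examples. Since this hypothesis is contained in the target rectangle, it can only err on positive points, never on negatives. The function names below are hypothetical.

```python
def tightest_fit(positives):
    """Return per-coordinate (lo, hi) bounds of the smallest
    axis-parallel rectangle containing all positive examples.
    `positives` is a non-empty list of n-dimensional points."""
    n = len(positives[0])
    lo = [min(p[i] for p in positives) for i in range(n)]
    hi = [max(p[i] for p in positives) for i in range(n)]
    return list(zip(lo, hi))

def predict(rect, x):
    """Label x positive iff it lies inside the fitted rectangle.
    Because the fit is the tightest one, errors are one-sided:
    a true negative is never labeled positive."""
    return all(lo <= xi <= hi for (lo, hi), xi in zip(rect, x))

# Usage: fit to positive examples in the plane (n = 2).
positives = [(1, 2), (3, 1), (2, 4)]
rect = tightest_fit(positives)   # [(1, 3), (1, 4)]
print(predict(rect, (2, 2)))     # True: inside the fitted rectangle
print(predict(rect, (5, 5)))     # False: outside, never a false positive
```

The hypothesis is consistent with the sample by construction, and shrinking the fit to the data is what yields the one-sided-error guarantee the abstract refers to.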
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 1995-05-25
Authors
-
Takimoto Eiji
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
-
Maruoka Akira
Graduate School of Information Sciences, Tohoku University, Sendai, Japan
Related Papers
- DS-1-2 On Formula Size Lower Bounds for Synthesis of Boolean Functions over Disjoint Sets of Variables
- An On-Line Prediction Algorithm Combining Several Prediction Strategies in the Shared Bet Model
- Learning k-Term Monotone Boolean Formulae
- Online Allocation with Risk Information(Invited Papers from New Horizons in Computing)
- Relationships between Horn Formulas and XOR-MDNF Formulas(Foundations of Computer Science)
- On the Sample Complexity of Consistent Learning with One-Sided Error
- Proper Learning Algorithm for Functions of k Terms under Smooth Distributions