TRADING MONOTONICITY DEMANDS VERSUS EFFICIENCY
Abstract
The present paper deals with the learnability of indexed families $ \mathcal{L} $ of uniformly recursive languages from positive data. We consider the influence of three monotonicity demands and their dual counterparts on the efficiency of the learning process. The efficiency of learning is measured by the number of mind changes a learning algorithm is allowed to perform. The three notions of (dual) monotonicity reflect different formalizations of the requirement that the learner has to produce better and better generalizations (specializations, respectively) when fed more and more data on the target concept. We distinguish between exact learnability ($ \mathcal{L} $ has to be inferred with respect to $ \mathcal{L} $), class preserving learning ($ \mathcal{L} $ has to be inferred with respect to some suitably chosen enumeration of all the languages from $ \mathcal{L} $), and class comprising inference ($ \mathcal{L} $ has to be learned with respect to some suitably chosen enumeration of uniformly recursive languages containing at least all the languages from $ \mathcal{L} $). In particular, we prove that relaxing the relevant (dual) monotonicity requirement may result in an arbitrarily large speed-up. However, whether or not such a speed-up can be achieved crucially depends on the set of allowed hypothesis spaces as well as on the (dual) monotonicity demands involved.
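To make the notion of mind changes concrete, here is a toy illustration (not taken from the paper): a learner for the hypothetical indexed family $ L_i = \{0, 1, \ldots, i\} $ that, fed positive data, always conjectures the least index consistent with everything seen so far and counts how often it revises its hypothesis. Since its conjectures can only grow, this learner also happens to be monotonic in the sense sketched above.

```python
def learn(text):
    """Process a sequence of positive examples for the family
    L_i = {0, 1, ..., i}; return (final hypothesis, mind changes).

    The learner conjectures the least i with (data seen so far) ⊆ L_i,
    i.e. the maximum element observed, and counts every revision
    of its hypothesis as one mind change.
    """
    hypothesis = None
    mind_changes = 0
    seen = set()
    for x in text:
        seen.add(x)
        guess = max(seen)  # least i such that seen ⊆ {0, ..., i}
        if hypothesis is not None and guess != hypothesis:
            mind_changes += 1
        hypothesis = guess
    return hypothesis, mind_changes
```

For instance, on the text 0, 2, 1, 5, 3 the learner revises its conjecture twice (at 2 and at 5) and converges to $ L_5 $; a mind-change bound in the sense of the abstract limits how many such revisions are permitted before convergence.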
- A paper from the Research Association of Statistical Sciences