Verbal Suffixes as Triggers: Constraints for Language Acquisition
Abstract
In this paper we propose, within a generative framework, two kinds of predetermined constraints on first language acquisition that make the postulation of innate linguistic knowledge, i.e. Universal Grammar in the Chomskyan sense, more feasible. As a guideline for the discussion, a three-dimensional space model of language learning is introduced, consisting of linguistic data, parameters, and a learning algorithm. Language acquisition can then broadly be characterized as a 'traverse' through this hypothetical space that a learner can in principle accomplish. Two formal theories of language learning defined over this space are described: the Triggering Learning Algorithm (TLA) of Gibson and Wexler (1994) and the Markov chain model of Niyogi and Berwick (1997). Working within the Principles-and-Parameters (P&P) approach of Chomsky (1981), the former formulates a formal procedure for setting parameter values, and the latter reanalyzes the TLA as a 'memoryless' Markov process over a state space consisting of hypothetical grammars. However, given the restrictions imposed by what can and cannot be considered psycholinguistically feasible processing, two questions arise for these formal models, the question of linguistic adequacy and the question of developmental compatibility, especially since they presuppose a 'linear' transition between grammars in the hypothetical space and the parsing of surface forms of sentences as triggers. Based on the multiple-grammar properties (Yang, 2004) of the early stage of German acquisition, we propose, as an alternative, a vP-analysis of the Optional Infinitive Stage (Wexler, 1994; 1998) and a developmental phase transition dependent on features and feature-checking. Starting from the first phase, in which only lexical categories appear and no functional categories such as v (light verb) or T (tense), convergence to the adult grammar occurs when the learner projects TP (Tense Phrase) at the final stage of acquisition. On this assumption, we recast the formal features and feature-checking in acquisition as built-in hard constraints that restrict the sort and number of parameters and their initial distributions; the soft constraint, on the other hand, is the assignment of default values to the parameters. This conception implies the logical necessity of UG, which, given the set of predetermined constraints, would allow the learner to acquire the adult grammar quickly and correctly, although the question of how to set up the feature operations in the Markovian state space remains open.
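Since the abstract builds on the Triggering Learning Algorithm (TLA) of Gibson and Wexler (1994) and its Markov chain reanalysis by Niyogi and Berwick (1997), a minimal sketch of the TLA's error-driven, single-value, greedy update rule may help. The sketch below is an illustrative assumption, not the paper's own model: the two-parameter `LANGUAGES` table, the `parses` predicate, and the `learn` helper are hypothetical toy stand-ins for a real parameter space and trigger corpus.

```python
import random

# Toy grammar space: each binary parameter vector maps to the set of
# surface strings it licenses (hypothetical word-order fragments,
# purely for illustration; not the grammars discussed in the paper).
LANGUAGES = {
    (0, 0): {"S V", "S V O"},
    (0, 1): {"V S", "V S O"},
    (1, 0): {"S V", "O V S"},
    (1, 1): {"V S", "O S V"},
}

def parses(grammar, sentence):
    """A grammar 'parses' a sentence iff its toy language contains it."""
    return sentence in LANGUAGES[grammar]

def tla_step(grammar, sentence, rng):
    """One TLA update: error-driven, Single Value Constraint, Greediness."""
    if parses(grammar, sentence):
        return grammar                        # no parsing error, no change
    i = rng.randrange(len(grammar))           # flip one parameter at random
    flipped = tuple(v ^ (1 if j == i else 0) for j, v in enumerate(grammar))
    # Greediness: adopt the flipped grammar only if it parses the trigger.
    return flipped if parses(flipped, sentence) else grammar

def learn(target, n_sentences=200, seed=0):
    """Feed i.i.d. sentences from the target language to the learner."""
    rng = random.Random(seed)
    grammar = (0, 0)                          # arbitrary initial hypothesis
    corpus = sorted(LANGUAGES[target])
    for _ in range(n_sentences):
        grammar = tla_step(grammar, rng.choice(corpus), rng)
    return grammar

if __name__ == "__main__":
    print(learn(target=(1, 1)))               # converges to (1, 1) in this toy
```

Because each update depends only on the current parameter vector and the incoming sentence, the transition probabilities between grammars induced by `tla_step` form a memoryless Markov chain over the state space of hypothetical grammars, which is the view under which Niyogi and Berwick (1997) analyze the TLA.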
- A paper from Osaka University of Foreign Studies
- 2006-02-16