Improving Automatic English Writing Assessment Using Regression Trees and Error-Weighting
Abstract
The proposed automated scoring system for English writing tests provides an assessment result, including a score and diagnostic feedback, to test-takers without human effort. The system analyzes an input sentence and detects errors related to spelling, syntax, and content similarity. The scoring model adopts a statistical approach, a regression tree. A scoring model in general calculates a score based on the count and types of automatically detected errors; accordingly, a system that detects errors more accurately also scores tests more accurately. The accuracy of error detection, however, cannot be fully guaranteed, for reasons such as parsing failures, incomplete knowledge bases, and the ambiguous nature of natural language. In this paper, we introduce an error-weighting technique, similar to the term-weighting widely used in information retrieval, which is applied to judge the reliability of the errors detected by the system. The score calculated with this technique proves more accurate than the score calculated without it.
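The abstract does not give the exact weighting formula, so the following is only a minimal sketch of the general idea: weight each detected error type with a TF-IDF-style score (frequency in the essay scaled by how rare the error type is across a training set) and feed the weighted error vector to a regression tree that predicts the human score. The error categories, toy data, and the use of scikit-learn's DecisionTreeRegressor are all illustrative assumptions, not the authors' implementation.

```python
# Sketch of error-weighting in the spirit of term-weighting (TF-IDF):
#   w(e, d) = freq(e, d) * log((1 + N) / (1 + df(e)))
# where df(e) is the number of training essays in which error type e occurs
# and N is the number of training essays. All names below are hypothetical.
import math
from collections import Counter
from sklearn.tree import DecisionTreeRegressor

ERROR_TYPES = ["spelling", "syntax", "content_similarity"]  # assumed categories

def error_weights(detected_errors, doc_freq, n_docs):
    """Turn a list of detected error labels into a weighted error vector."""
    counts = Counter(detected_errors)
    return [
        counts[e] * math.log((1 + n_docs) / (1 + doc_freq.get(e, 0)))
        for e in ERROR_TYPES
    ]

# Toy training data: detected errors per essay and human-assigned scores.
train_errors = [
    ["spelling", "spelling", "syntax"],
    ["syntax"],
    ["content_similarity", "spelling"],
    [],
]
train_scores = [2.0, 3.5, 2.5, 5.0]

# Document frequency of each error type over the training essays.
doc_freq = Counter(e for errs in train_errors for e in set(errs))
n_docs = len(train_errors)

# Fit a regression tree on the weighted error vectors.
X = [error_weights(errs, doc_freq, n_docs) for errs in train_errors]
tree = DecisionTreeRegressor(max_depth=3).fit(X, train_scores)

# Score a new essay from its (hypothetically) detected errors.
new_essay_errors = ["spelling", "syntax"]
print(tree.predict([error_weights(new_essay_errors, doc_freq, n_docs)]))
```

Under this weighting, error types the detector fires on almost everywhere contribute little to the score, which is one plausible way to discount unreliable detections as the abstract describes.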
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2010-08-01
Authors
- Lee Kong-joo (Dept. of CSE, Ewha Womans Univ.)
- Kim Jee-eun (Dept. of English Linguistics, Hankuk Univ. of Foreign Studies)
- Lee Kong-joo (Dept. of Information & Communication Engineering, Chungnam National Univ.)
Related Papers
- Normalizing Syntactic Structure Using Part-of-Speech Tags and Binary Rules (Development of Advanced Computer Systems)
- Extracting Partial Parsing Rules from Tree-Annotated Corpus: Toward Deterministic Global Parsing (Natural Language Processing)
- Improving Automatic English Writing Assessment Using Regression Trees and Error-Weighting