Class-Dependent Modeling for Dialog Translation
Abstract
This paper presents a technique for class-dependent decoding for statistical machine translation (SMT). The approach differs from previous methods of class-dependent translation in that the class-dependent forms of all models are integrated directly into the decoding process. We employ probabilistic mixture weights between models that can change dynamically on a sentence-by-sentence basis depending on the characteristics of the source sentence. The effectiveness of this approach is demonstrated by evaluating its performance on travel conversation data. We used this approach to tackle the translation of questions and declarative sentences using class-dependent models. To achieve this, our system integrated two sets of models specifically built to deal with sentences that fall into one of two classes of dialog sentence: questions and declarations, with a third set of models built with all of the data to handle the general case. The technique was thoroughly evaluated on data from 16 language pairs using 6 machine translation evaluation metrics. We found the results were corpus-dependent, but in most cases our system was able to improve translation performance, and for some languages the improvements were substantial.
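The core idea above, sentence-level mixture weights that blend class-dependent and general models during decoding, can be illustrated with a minimal sketch. The function names, the three class labels, and the use of simple linear interpolation of probabilities are assumptions for illustration only, not the paper's actual implementation or interfaces.

```python
import math
from typing import Callable, Dict

def mixture_score(
    source: str,
    candidate: str,
    class_posterior: Callable[[str], Dict[str, float]],
    model_scores: Dict[str, Callable[[str, str], float]],
) -> float:
    """Score a translation candidate with class-dependent mixture weights.

    class_posterior(source) returns hypothetical per-sentence weights,
    e.g. {"question": 0.7, "declaration": 0.2, "general": 0.1}.
    model_scores maps each class label to a function returning
    log P(candidate | source) under that class's model set.
    """
    weights = class_posterior(source)
    # Linearly interpolate the class-dependent probabilities using the
    # sentence-dependent weights, then return the combined log score.
    prob = sum(
        w * math.exp(model_scores[cls](source, candidate))
        for cls, w in weights.items()
    )
    return math.log(prob) if prob > 0 else float("-inf")
```

In this sketch the weights are recomputed from the source sentence alone, so they can change dynamically on a sentence-by-sentence basis as described; whether the combination is linear or log-linear, and how the class posterior is estimated, would follow the details given in the paper itself.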
Authors
Related Papers
- Class-Dependent Modeling for Dialog Translation
- Using Mutual Information Criterion to Design an Efficient Phoneme Set for Chinese Speech Recognition
- Translation of Untranslatable Words-Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation
- Multiple Translation-Engine-based Hypotheses and Edit-Distance-based Rescoring for a Greedy Decoder for Statistical Machine Translation (Natural-Language Processing)
- A Bayesian Model of Transliteration and Its Human Evaluation When Integrated into a Machine Translation System