Adaptation of reordering models for statistical machine translation

Proceedings title: Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Conference: 2013 North American Chapter of the Association for Computational Linguistics: Human Language Technologies, June 9–15, 2013, Atlanta, GA
Pages: 938–946; # of pages: 9
Abstract: Previous research on domain adaptation (DA) for statistical machine translation (SMT) has mainly focused on the translation model (TM) and the language model (LM). To the best of our knowledge, there is no previous work on reordering model (RM) adaptation for phrase-based SMT. In this paper, we demonstrate that mixture model adaptation of a lexicalized RM can significantly improve SMT performance, even when the system already contains a domain-adapted TM and LM. We find that, surprisingly, different training corpora can vary widely in their reordering characteristics for particular phrase pairs. Furthermore, particular training corpora may be highly suitable for training the TM or the LM, but unsuitable for training the RM, or vice versa, so mixture weights for these models should be estimated separately. An additional contribution of the paper is to propose two improvements to mixture model adaptation: smoothing the in-domain sample, and weighting instances by document frequency. Applied to mixture RMs in our experiments, these techniques (especially smoothing) yield significant performance improvements.
Publication date: 2013
Affiliation: National Research Council Canada; Information and Communication Technologies
Peer reviewed: Yes
NPARC number: 21270980
Record identifier: e8bb11c8-a4ab-42c6-ab3b-ad7f3c9f1591
Record created: 2014-02-20
Record modified: 2016-05-09
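
The abstract describes mixture model adaptation of a lexicalized reordering model: reordering statistics from several training corpora are combined with mixture weights estimated separately from the TM and LM weights. The sketch below is a minimal illustration of such a linear mixture over lexicalized reordering tables, not the authors' implementation; the function name, the toy phrase pair, and the probabilities are invented for illustration, and the monotone/swap/discontinuous orientation set is assumed from standard Moses-style lexicalized RMs.

```python
from collections import defaultdict

def interpolate_reordering_tables(tables, weights):
    """Linearly interpolate lexicalized reordering tables.

    Each table maps a phrase pair to a distribution over
    orientations (monotone, swap, discontinuous).
    """
    assert len(tables) == len(weights)
    mixed = defaultdict(lambda: defaultdict(float))
    for table, w in zip(tables, weights):
        for phrase_pair, dist in table.items():
            for orientation, p in dist.items():
                mixed[phrase_pair][orientation] += w * p
    # Renormalize, since a phrase pair may be absent from some tables.
    for dist in mixed.values():
        z = sum(dist.values())
        for orientation in dist:
            dist[orientation] /= z
    return mixed

# Hypothetical corpora whose reordering behaviour diverges for the
# same phrase pair, as the abstract reports can happen in practice.
in_domain = {("il", "he"): {"mono": 0.7, "swap": 0.2, "disc": 0.1}}
out_domain = {("il", "he"): {"mono": 0.3, "swap": 0.6, "disc": 0.1}}

mixed = interpolate_reordering_tables([in_domain, out_domain], [0.8, 0.2])
print(mixed[("il", "he")])  # ~ {'mono': 0.62, 'swap': 0.28, 'disc': 0.10}
```

In this toy setup the in-domain weight dominates, so the mixed distribution stays close to the in-domain table; the paper's further refinements (smoothing the in-domain sample and weighting instances by document frequency) would adjust the statistics that feed into such tables rather than the interpolation itself.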