Batch tuning strategies for statistical machine translation

Authors: Colin Cherry; George Foster
Type: Article
Proceedings title: Proceedings of the 2012 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (HLT-NAACL 2012)
Conference: North American Chapter of the Association for Computational Linguistics: Human Language Technologies (HLT-NAACL 2012), Montreal, QC, June 3-8, 2012
Article number: N12-1047
Pages: 427–436; # of pages: 10
Abstract: There has been a proliferation of recent work on SMT tuning algorithms capable of handling larger feature sets than the traditional MERT approach. We analyze a number of these algorithms in terms of their sentence-level loss functions, which motivates several new approaches, including a Structured SVM. We perform empirical comparisons of eight different tuning strategies, including MERT, in a variety of settings. Among other results, we find that a simple and efficient batch version of MIRA performs at least as well as training online, and consistently outperforms other options.
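For readers new to margin-based tuning, the sketch below illustrates the kind of hope/fear update a batch MIRA pass can apply over fixed n-best lists. It is a minimal Python illustration only: the function name, the (feature vector, loss) candidate layout, and the regularization constant C are assumptions made for exposition, not the authors' implementation.

    import numpy as np

    def batch_mira_epoch(w, nbests, C=0.01):
        # One illustrative pass of a MIRA-style update over fixed n-best lists.
        # nbests: list of per-sentence candidate lists; each candidate is a
        # (feature_vector, loss) pair, with loss = 1 - sentence-level BLEU.
        w = w.copy()
        for candidates in nbests:
            # "Hope" hypothesis: high model score and low loss.
            hope_f, hope_l = max(candidates, key=lambda c: np.dot(w, c[0]) - c[1])
            # "Fear" hypothesis: high model score but high loss.
            fear_f, fear_l = max(candidates, key=lambda c: np.dot(w, c[0]) + c[1])
            delta = hope_f - fear_f      # feature difference
            margin = np.dot(w, delta)    # current score gap between hope and fear
            loss = fear_l - hope_l       # loss gap the margin should cover
            if margin < loss:            # margin constraint violated
                # Closed-form step size, clipped by C (passive-aggressive style).
                step = min(C, (loss - margin) / max(np.dot(delta, delta), 1e-12))
                w = w + step * delta
        return w

The comparison the abstract draws is between applying such updates online (decoding and updating one sentence at a time) and in batch over accumulated n-best lists; the reported finding is that the batch variant performs at least as well, and consistently outperforms the other tuning strategies tested.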
Publication date: 2012
Publisher: Association for Computational Linguistics
Language: English
Affiliation: Information and Communication Technologies; National Research Council Canada
Peer reviewed: Yes
NPARC number: 20262877
Record identifier: 1101df04-9f92-4758-a257-3a8457183e06
Record created: 2012-07-10
Record modified: 2016-05-09