Can Markov models over minimal translation units help phrase-based SMT?

  • Nadir Durrani
  • Alexander Fraser
  • Helmut Schmid
  • Hieu Hoang
  • Philipp Koehn

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-reviewed

48 Citations (Scopus)

Abstract

The phrase-based and N-gram-based SMT frameworks complement each other. While the former is better able to memorize, the latter provides a more principled model that captures dependencies across phrasal boundaries. Some work has been done to combine insights from these two frameworks. A recent successful attempt showed the advantage of using phrase-based search on top of an N-gram-based model. We probe this question in the reverse direction by investigating whether integrating N-gram-based translation and reordering models into a phrase-based decoder helps overcome the problematic phrasal independence assumption. A large-scale evaluation over 8 language pairs shows that performance significantly improves.
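To make the core idea concrete, here is a minimal toy sketch (not the paper's implementation) of a Markov model over minimal translation units: each MTU is treated as an atomic source/target pair, and a translation is scored as a chain of MTU transitions, so the model conditions across what would otherwise be phrase boundaries. The corpus, MTU inventory, and add-one smoothing are illustrative assumptions.

```python
from collections import Counter

def train_bigram(mtu_sequences):
    """Collect unigram and bigram counts over sequences of MTUs.

    Each MTU is a (source_word, target_word) tuple; sequences are
    prefixed with a start symbol so the first MTU is also conditioned.
    """
    unigrams, bigrams = Counter(), Counter()
    for seq in mtu_sequences:
        seq = [("<s>", "<s>")] + seq
        for prev, cur in zip(seq, seq[1:]):
            unigrams[prev] += 1
            bigrams[(prev, cur)] += 1
    return unigrams, bigrams

def score(seq, unigrams, bigrams, vocab_size):
    """Add-one smoothed probability of an MTU sequence under the bigram model."""
    p = 1.0
    seq = [("<s>", "<s>")] + seq
    for prev, cur in zip(seq, seq[1:]):
        p *= (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab_size)
    return p

# Hypothetical toy corpus of German-English MTU sequences.
corpus = [
    [("das", "the"), ("haus", "house")],
    [("das", "the"), ("auto", "car")],
]
unigrams, bigrams = train_bigram(corpus)
vocab_size = 4  # distinct MTUs including the start symbol

p_attested = score([("das", "the"), ("haus", "house")],
                   unigrams, bigrams, vocab_size)
p_reordered = score([("haus", "house"), ("das", "the")],
                    unigrams, bigrams, vocab_size)
# The attested MTU order scores higher than the unseen reordering.
```

In a real decoder this score would be one feature among many (alongside phrase translation, language model, and reordering features); the sketch only shows how conditioning on the previous MTU lets probability mass flow across phrasal boundaries.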

Original language: English
Title of host publication: Short Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 399-405
Number of pages: 7
ISBN (Print): 9781937284510
Publication status: Published - 2013
Externally published: Yes
Event: 51st Annual Meeting of the Association for Computational Linguistics, ACL 2013 - Sofia, Bulgaria
Duration: 4 Aug 2013 - 9 Aug 2013

Publication series

Name: ACL 2013 - 51st Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference
Volume: 2

Conference

Conference: 51st Annual Meeting of the Association for Computational Linguistics, ACL 2013
Country/Territory: Bulgaria
City: Sofia
Period: 4/08/13 - 9/08/13
