Pruning Large Translation Models

Physical constraints, such as the memory available on the target device, may limit the size of translation models that can be used in practice, so we may have to prune the models by removing the least relevant phrase pairs.
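
The simplest form of such pruning is to keep only the most probable translations for each source phrase. Below is a minimal sketch of this idea; the representation of the phrase table as (source, target, probability) tuples and the cutoff k are assumptions made purely for illustration, not the procedure of any particular system.

from collections import defaultdict

def prune_phrase_table(entries, k=20):
    # entries: iterable of (source_phrase, target_phrase, probability) tuples.
    # Keep only the k most probable target phrases for each source phrase.
    by_source = defaultdict(list)
    for source, target, prob in entries:
        by_source[source].append((prob, target))
    pruned = []
    for source, candidates in by_source.items():
        for prob, target in sorted(candidates, reverse=True)[:k]:
            pruned.append((source, target, prob))
    return pruned

table = [("the house", "das Haus", 0.6),
         ("the house", "die Haus", 0.01),
         ("the house", "das Gebäude", 0.2)]
print(prune_phrase_table(table, k=2))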

Pruning large translation models is the main subject of 23 publications, 11 of which are discussed here.

Publications

Quirk and Menezes (2006) argue that extracting only minimal phrases, i.e. the smallest phrase pairs that map each entire sentence pair, does not hurt performance. This is also the basis of the n-gram translation model (Mariño et al., 2006; Costa-jussà et al., 2007), a variant of the phrase-based model.
Discarding unlikely phrase pairs based on significance tests of their more-than-random occurrence reduces the phrase table drastically and may even yield increases in performance (Johnson et al., 2007). Wu and Wang (2007) propose a method for filtering the noise in the phrase translation table based on a log-likelihood ratio. Kutsumi et al. (2005) use a support vector machine for cleaning phrase tables.
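
As a rough illustration of this line of work, the sketch below scores a phrase pair by a log-likelihood ratio (G²) over the 2x2 contingency table of its co-occurrence counts and discards pairs whose score falls below a significance threshold. Counting over sentence pairs and the particular threshold are assumptions for this example, not the exact procedures of the papers cited above.

import math

def log_likelihood_ratio(n_joint, n_src, n_tgt, n_total):
    # G^2 statistic for the 2x2 contingency table of a phrase pair, where
    # n_joint = sentence pairs containing both phrases,
    # n_src   = sentence pairs containing the source phrase,
    # n_tgt   = sentence pairs containing the target phrase,
    # n_total = total number of sentence pairs in the training corpus.
    observed = [n_joint,                             # source and target co-occur
                n_src - n_joint,                     # source without target
                n_tgt - n_joint,                     # target without source
                n_total - n_src - n_tgt + n_joint]   # neither occurs
    rows = [n_src, n_total - n_src]
    cols = [n_tgt, n_total - n_tgt]
    g2 = 0.0
    for i in range(2):
        for j in range(2):
            o = observed[2 * i + j]
            e = rows[i] * cols[j] / n_total
            if o > 0:
                g2 += o * math.log(o / e)
    return 2.0 * g2

# A pair seen together 10 times, with 12 and 11 total occurrences over
# 10,000 sentence pairs, co-occurs far more often than chance predicts:
print(log_likelihood_ratio(n_joint=10, n_src=12, n_tgt=11, n_total=10000))
# Pairs scoring below a chosen critical value (e.g. 10.83, p < 0.001 at
# one degree of freedom) would be discarded from the phrase table.
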
Such considerations may also be taken into account in a second-pass phrase extraction stage that avoids extracting bad phrase pairs (Zettlemoyer and Moore, 2007). When porting phrase-based models to small devices such as PDAs (Zhang and Vogel, 2007), the translation table has to be reduced to fit into a fixed amount of memory. Eck et al. (2007); Eck et al. (2007b) prune the translation table based on how often a phrase pair was considered during decoding and how often it was used in the best translation. Sanchis-Trilles et al. (2011) re-translate the training corpus and re-estimate the phrase table from the phrases used in the best derivations, greatly reducing the size of the phrase table at some cost in quality.
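
In the spirit of the usage-based criteria of Eck et al. (2007; 2007b), the following sketch keeps only the phrase pairs most frequently used by the decoder on a development corpus, up to a fixed budget. The data structures and the budget parameter are illustrative assumptions, not the format of any specific toolkit.

from collections import Counter

def prune_by_usage(phrase_table, usage_counts, budget):
    # phrase_table:  dict mapping (source, target) -> feature values
    # usage_counts:  Counter of how often each pair appeared in the
    #                decoder's best translations of a development corpus
    # budget:        maximum number of phrase pairs to keep
    ranked = sorted(phrase_table, key=lambda pair: usage_counts[pair], reverse=True)
    keep = set(ranked[:budget])
    return {pair: feats for pair, feats in phrase_table.items() if pair in keep}

usage = Counter({("the house", "das Haus"): 42})
table = {("the house", "das Haus"): (0.6,), ("the house", "die Haus"): (0.01,)}
print(prune_by_usage(table, usage, budget=1))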

Benchmarks

Discussion

Related Topics

New Publications

  • Xu et al. (2013)
  • Eck et al. (2005)
  • Martzoukos et al. (2014)
  • Ling et al. (2012)
  • Zens et al. (2012)
  • Lee et al. (2012)
  • Johnson (2012)
  • Tomeh et al. (2009)
  • He et al. (2009)
  • Iglesias et al. (2009)
  • Germann et al. (2009)
