Deep Syntactic Models

Syntactic models of language have been extended to build up more abstract and even semantic representations. From the perspective of machine translation, if it is possible to obtain an interlingual meaning representation from the input and to generate output text from it, then the translation problem vanishes.

Deep Syntax is the main subject of 32 publications, 3 of which are discussed here.

Publications

Machine translation systems that use deep syntax are usually built as a series of sequential steps (Zabokrtsky et al., 2008).
On the other hand, with such complex models as a starting point, probabilistic components may be added. Using traditional rule-based components for syntactic and semantic analysis of the input and a rule-based generation component for the output allows the training of translation models that map between f-structures (Riezler and Maxwell, 2006).
Cowan et al. (2006) propose a translation model that explicitly models clause structure as aligned extended projections and that is trained discriminatively.
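
To make this pipeline structure concrete, the sketch below shows the analysis, transfer, and generation stages that such systems chain together, written as schematic Python. It is only an illustration of the architecture, not the interface of any of the systems cited above: the DeepTree class and the analyze, transfer, and generate functions are hypothetical placeholders, and each stage is reduced to a trivial stand-in.

    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class DeepTree:
        """Toy stand-in for a deep-syntactic structure (e.g., a t-tree or f-structure)."""
        lemmas: List[str]
        relations: List[str] = field(default_factory=list)


    def analyze(source_sentence: str) -> DeepTree:
        # Placeholder analysis step: real systems run tokenization, tagging,
        # parsing, and deep-syntactic conversion as separate sequential steps.
        return DeepTree(lemmas=source_sentence.lower().split())


    def transfer(tree: DeepTree, lexicon: Dict[str, str]) -> DeepTree:
        # Placeholder transfer step: map source lemmas to target lemmas node by
        # node; probabilistic components would score competing mappings here.
        return DeepTree(lemmas=[lexicon.get(lemma, lemma) for lemma in tree.lemmas],
                        relations=tree.relations)


    def generate(tree: DeepTree) -> str:
        # Placeholder generation step: real systems order nodes, inflect word
        # forms, and insert function words.
        return " ".join(tree.lemmas)


    def translate(sentence: str, lexicon: Dict[str, str]) -> str:
        """Run translation as a fixed sequence of analysis, transfer, generation."""
        return generate(transfer(analyze(sentence), lexicon))


    if __name__ == "__main__":
        toy_lexicon = {"small": "klein", "house": "haus"}
        print(translate("small house", toy_lexicon))  # prints "klein haus"

In an actual deep-syntactic system, each of these placeholders would itself be a series of sequential steps, and the transfer stage is where trained, probabilistic models of structure mapping can replace or augment hand-written rules.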

Benchmarks

Discussion

Related Topics

New Publications

  • Ojha (2019)
  • Ojha (2019)
  • Beloucif et al. (2015)
  • Butler (2015)
  • Sulem et al. (2015)
  • Tamchyna et al. (2015)
  • Quernheim (2015)
  • Li et al. (2017)
  • Li et al. (2016)
  • Nadejde et al. (2016)
  • Tang et al. (2016)
  • Nadejde et al. (2016)
  • Pust et al. (2015)
  • Vanallemeersch and Vandeghinste (2015)
  • Wu and Palmer (2015)
  • Andreas et al. (2013)
  • Bazrafshan and Gildea (2013)
  • Rosa et al. (2013)
  • Chiang et al. (2013)
  • Zhai et al. (2013)
  • Banarescu et al. (2013)
  • Wu et al. (2010)
  • Baker et al. (2010)
  • Stein et al. (2010)
  • Wu et al. (2010)
  • Davidov and Rappoport (2010)
  • Jones et al. (2012)
  • Zhai et al. (2012)
  • Xiong et al. (2012)
  • Dušek et al. (2012)
