
Non-projective Dependency-based Pre-Reordering with Recurrent Neural Network for Machine Translation

Giuseppe Attardi; Antonio Valerio Miceli Barone
2015-01-01

Abstract

The quality of statistical machine translation performed with phrase-based approaches can be increased by permuting the words of the source sentences into an order that resembles that of the target language. We propose a class of recurrent neural models that exploit source-side dependency syntax features to reorder the words into a target-like order. We evaluate these models on the German-to-English language pair, showing significant improvements over a phrase-based Moses baseline and achieving quality similar or superior to that of hand-coded syntactic reordering rules.
Year: 2015
ISBN: 978-1-941643-41-9
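
To illustrate the general idea described in the abstract, the following is a minimal sketch of a recurrent scorer that reads the source tokens in a candidate target-like order, conditioned on per-token source-side dependency features. All names (DepToken, ReorderingScorer), feature sets, and sizes are illustrative assumptions; this is not the model proposed in the paper, only a hedged sketch of the kind of architecture the abstract refers to, written in PyTorch.

# Hypothetical sketch: a GRU-based scorer for candidate reorderings of a
# source sentence, conditioned on per-token dependency features.
# Names and sizes are illustrative, not taken from the paper.

import torch
import torch.nn as nn
from dataclasses import dataclass

@dataclass
class DepToken:
    word_id: int   # index into the source vocabulary
    pos_id: int    # part-of-speech tag id
    dep_id: int    # dependency relation to the head (e.g. subject, object)

class ReorderingScorer(nn.Module):
    """Scores a permutation of the source tokens: the GRU reads the tokens
    in the proposed (target-like) order and a linear layer maps the final
    hidden state to a single score (higher = more target-like)."""
    def __init__(self, n_words, n_pos, n_dep, emb=64, hidden=128):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, emb)
        self.pos_emb = nn.Embedding(n_pos, 16)
        self.dep_emb = nn.Embedding(n_dep, 16)
        self.gru = nn.GRU(emb + 32, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, tokens, permutation):
        # Re-arrange the annotated tokens into the candidate order.
        ordered = [tokens[i] for i in permutation]
        w = torch.tensor([t.word_id for t in ordered]).unsqueeze(0)
        p = torch.tensor([t.pos_id for t in ordered]).unsqueeze(0)
        d = torch.tensor([t.dep_id for t in ordered]).unsqueeze(0)
        x = torch.cat([self.word_emb(w), self.pos_emb(p), self.dep_emb(d)], dim=-1)
        _, h = self.gru(x)                  # h: (num_layers, batch, hidden)
        return self.out(h[-1]).squeeze(-1)  # one score per candidate

# Usage: score two candidate orders of a 3-token sentence and keep the best.
if __name__ == "__main__":
    model = ReorderingScorer(n_words=1000, n_pos=50, n_dep=40)
    sent = [DepToken(5, 3, 7), DepToken(42, 1, 2), DepToken(17, 8, 4)]
    for perm in ([0, 1, 2], [0, 2, 1]):
        print(perm, float(model(sent, perm)))

In a typical pre-reordering pipeline such a scorer would be trained on reference permutations derived from word alignments and used to pick the highest-scoring candidate order before phrase-based decoding; these pipeline details are assumptions here, not taken from the paper.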


Use this identifier to cite or link to this document: https://hdl.handle.net/11568/767572
Citations
  • Scopus: 10
  • ISI Web of Science: 3